CLL patients ‘cured’: 10 years post infusion, CAR T cells persist


Two patients with chronic lymphocytic leukemia (CLL) who 10 years ago were among the first to receive groundbreaking chimeric antigen receptor T-cell therapy were still in remission a decade later, and they continued to show detectable levels of CAR T cells.

“We can now conclude that CAR T cells can actually cure patients with leukemia based on these results,” said senior author Carl H. June, MD, in a press briefing on the study published in Nature.


“The major finding from this paper is that, 10 years down the road, you can find these [CAR T] cells,” Dr. June, director of the Center for Cellular Immunotherapies, University of Pennsylvania, Philadelphia, added. “The cells have evolved, and that was a big surprise ... but they are still able to kill leukemia cells 10 years after infusion.”

CAR T-cell therapy, in which patients’ own T cells are removed, reprogrammed in a lab to recognize and attack cancer cells, and then infused back into the patients, has transformed treatment of various blood cancers and shows often-remarkable results in achieving remissions.

While the treatment has become routine therapy for certain leukemias, long-term data on the fate and function of the cells have been highly anticipated.

In the first published observations of a 10-year follow-up of patients treated with CAR T cells, Dr. June and colleagues described the findings for two patients, both with CLL, who back in 2010 were among the first to be treated with this groundbreaking therapy at the University of Pennsylvania.

A decade later, the CAR T cells remained detectable in both patients, who achieved complete remission in their first year of treatment, and both have sustained that remission.

Notably, the cells have evolved over the years – from a population initially dominated by killer T cells to one dominated by proliferative CD4-positive CAR T cells – with one of the patients exclusively having CD4-positive cells at year 9.3.


“The killer T cells did the initial heavy lifting of eliminating the tumor,” first author J. Joseph Melenhorst, PhD, said in an interview.

“Once their job was done, those cells went down to very low levels, but the CD4-positive population persisted,” said Dr. Melenhorst, who established the lab at the University of Pennsylvania to follow patients treated with CAR T-cell therapy. “[This] delayed phase of immune response against cancer is a novel insight, and we were surprised to see it.”

Importantly, Dr. Melenhorst noted, the clonal makeup of the CD4-positive cells stabilized and became dominated by a small number of clones, suggesting further sustainability.

When one of the two patients, Doug Olson, who participated in the press conference, donated his cells back to the center after 9.3 years, the researchers found that his cells were still capable of destroying leukemia cells in the lab.

“Ten years [post infusion], we can’t find any of the leukemia cells and we still have the CAR T cells that are on patrol and on surveillance for residual leukemia,” Dr. June said.

One challenge arising from the otherwise desirable elimination of leukemia cells is that it becomes difficult to study how CAR T-cell activity is sustained.

“The aspect of how the remission is maintained [is] very hard to study in a patient when there is no leukemia at all,” Dr. June explained. “It could be the last cell was gone within 3 weeks [of treatment], or it could be that the [cancer cells] are coming up like whack-a-moles, and they are killed because these CAR T cells are on patrol.”

Sadly, the other CLL patient, Bill Ludwig, who was first to receive the CAR T-cell treatment, died in 2021 from COVID-19.

Effects in other blood diseases similar?

CAR T-cell therapy is currently approved in the United States for several blood cancers, and whether similar long-term patterns of the cells may be observed in other patient and cancer types remains to be seen, Dr. Melenhorst said.

“I think in CLL we will see something similar, but in other diseases, we have yet to learn,” he said. “It may depend on issues including which domain has been engineered into the CAR.”

While the prospect of some patients being “cured” is exciting, responses to the therapy have generally been mixed. In CLL, for instance, full remissions have been maintained in about a quarter of patients, with higher rates observed in some lymphomas and in pediatric acute lymphoblastic leukemia, Dr. Melenhorst explained.

The effects of CAR T-cell therapy in solid cancers have so far been more disappointing, with no research centers reproducing the kinds of results that have been seen with blood cancers.

“There appear to be a number of reasons, including that the [solid] tumor is more complex, and these solid cancers have ways to evade the immune system that need to be overcome,” Dr. June explained.

And despite the more encouraging findings in blood cancers, even with those, “the biggest disappointment is that CAR T-cell therapy doesn’t work all the time. It doesn’t work in every patient,” coauthor David Porter, MD, the University of Pennsylvania oncologist who treated the two patients, said in the press briefing.

“I think the importance of the Nature study is that we are starting to learn the mechanisms of why and how this works, so that we can start to get at how to make it work for more people,” Dr. Porter added. “But what we do see is that, when it works, it really is beyond what we expected 10 or 11 years ago.”

Speaking in the press briefing, Mr. Olson described how, several weeks after his treatment in 2010, he became very ill with cytokine release syndrome, now recognized as a common, short-term side effect of the therapy.

However, after Mr. Olson recovered a few days later, Dr. Porter gave him the remarkable news that “we cannot find a single cancer cell. You appear completely free of CLL.”

Mr. Olson reported that he has since lived a “full life,” kept working, and has even run some half-marathons.

Dr. June confided that the current 10-year results far exceed the team’s early expectations for CAR T-cell therapy. “After Doug [initially] signed his informed consent document for this, we thought that the cells would all be gone within a month or 2. The fact that they have survived for 10 years was a major surprise – and a happy one at that.”

Dr. June, Dr. Melenhorst, and Dr. Porter reported holding patents related to CAR T-cell manufacturing and biomarker discovery.


CDC issues new pneumococcal vaccine recommendations for adults


Updated pneumococcal vaccine recommendations for adults from the Centers for Disease Control and Prevention call for the use of the two recently approved vaccines in a more streamlined approach to avoid the complexities of age and patient conditions that hindered previous recommendations.

The recommendations, voted on by the CDC’s Advisory Committee on Immunization Practices (ACIP) in October and made final in January with publication in the agency’s Morbidity and Mortality Weekly Report (MMWR), call for use of the 15-valent pneumococcal conjugate vaccine (PCV15; Vaxneuvance, Merck Sharp & Dohme) or 20-valent PCV (PREVNAR20; Wyeth Pharmaceuticals).

The recommendations apply to PCV-naive adults in the United States who are either aged 65 years or older, or who are aged 19-64 years and have underlying conditions such as diabetes, chronic heart or liver disease, or HIV, and have not previously received a PCV or whose previous vaccination history is unknown.

If the PCV15 vaccine is used, a subsequent dose of the 23-valent pneumococcal polysaccharide vaccine (PPSV23; Pneumovax23, Merck Sharp & Dohme) should be provided, typically at least 1 year later, under the recommendations.

As reported by this news organization, PCV15 and PREVNAR20 received approval from the Food and Drug Administration last July.

Those approvals provided an impetus for the revised recommendations, “offer[ing] an opportunity to review the existing recommendations and available data,” Miwako Kobayashi, MD, first author of the MMWR report and a medical epidemiologist with the National Center for Immunization and Respiratory Diseases, CDC, in Atlanta, said in an interview.

“As part of that process, ACIP strived to simplify the recommendations,” she said.

The previous recommendations called for the PCV13 vaccine and the PPSV23 and had varying conditions (depending on certain age and risk groups) that added complexity to the process. Under the new approach, the same recommendation applies regardless of specific medical conditions or other risk factors.

“With the simplified recommendation for adults 19 through 64, we expect coverage may increase among this population,” Dr. Kobayashi said.

Compared with the PCV13 vaccine, PREVNAR20 protects against seven additional serotypes involved in cases of invasive pneumococcal disease (IPD) and pneumonia, which are responsible for up to 40% of all cases of pneumococcal disease and related deaths in the United States.

While PREVNAR20 includes five more pneumococcal serotypes than PCV15, the CDC does not recommend one over the other, Dr. Kobayashi noted.

More than 90% of cases of adult IPD involve older adults and adults with chronic medical conditions or immunocompromising conditions, cerebrospinal fluid leaks, or cochlear implants, the MMWR report notes.

Commenting on the recommendations, Amit A. Shah, MD, a geriatrician with the Mayo Clinic in Phoenix, Ariz., underscored the need for clinicians to be proactive in recommending the vaccines to those patients.

“Despite only needing one vaccine dose after turning 65 to be considered vaccinated, only about 70% of people in this group have received any pneumococcal vaccination,” he said in an interview. “This percentage has not increased much over the past several years.”

The new approach should help change that, he said.

“These new recommendations are a significant simplification from the prior confusing and challenging-to-implement recommendations from 2019,” Dr. Shah explained.

Among the 2019 recommendations was a stipulation for “shared decision-making” with PCV13, a conversation that often only complicated matters, he noted.

“Patients and providers alike had confusion about this since it was not a clear-cut ‘yes, give it’ or ‘no, do not give it any longer’ recommendation.”

“Now that this new recommendation will require no extra time for a discussion in the clinic, and just a simple ‘it’s time for your pneumonia shot’ offer, this may become more feasible,” Dr. Shah added. “In addition, removal of the shared decision-making stipulation allows for this immunization to be easily protocolized in the clinic, similar to automatic offers to the flu vaccine for patients each year.”

According to the CDC, pneumococcal pneumonia causes an estimated 150,000 hospitalizations each year in the United States, while pneumococcal meningitis and bacteremia killed approximately 3,250 people in the United States in 2019.

“Clinicians are patients’ most trusted resource when it comes to vaccine recommendations,” Dr. Kobayashi said. “We encourage all clinicians to recommend pneumococcal vaccines when indicated.”

Dr. Kobayashi and Dr. Shah have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

Updated pneumococcal vaccine recommendations for adults from the Centers for Disease Control and Prevention call for the use of the two recently approved vaccines in a more streamlined approach to avoid the complexities of age and patient conditions that hindered previous recommendations.

The recommendations, voted on by the CDC’s Advisory Committee on Immunization Practices (ACIP) in October and made final in January with publication in the agency’s Morbidity and Mortality Weekly Report (MMWR), call for use of the 15-valent pneumococcal conjugate vaccine (PCV15; Vaxneuvance, Merck Sharp & Dohme) or 20-valent PCV (PREVNAR20; Wyeth Pharmaceuticals).

The recommendations apply to PCV-naive adults in the United States who are either aged 65 years or older, or who are aged 19-64 years and have underlying conditions such as diabetes, chronic heart or liver disease, or HIV, and have not previously received a PCV or whose previous vaccination history is unknown.

If the PCV15 vaccine is used, a subsequent dose of the 23-valent pneumococcal polysaccharide vaccine (PPSV23; Pneumovax23, Merck Sharp & Dohme) should be provided, typically at least 1 year later, under the recommendations.

As reported by this news organization, PCV15 and PREVNAR20 received approval from the Food and Drug Administration last July.

Those approvals provided an impetus for the revised recommendations, “offer[ing] an opportunity to review the existing recommendations and available data,” Miwako Kobayashi, MD, first author of the MMWR report and a medical epidemiologist with the National Center for Immunization and Respiratory Diseases, CDC, in Atlanta, said in an interview.

“As part of that process, ACIP strived to simplify the recommendations,” she said.

The previous recommendations called for the PCV13 vaccine and the PPSV23 and had varying conditions (depending on certain age and risk groups) that added complexity to the process. Under the new approach, the same recommendation applies regardless of specific medical conditions or other risk factors.

“With the simplified recommendation for adults 19 through 64, we expect coverage may increase among this population,” Dr. Kobayashi said.

Compared with the PCV13 vaccine, PREVNAR20 protects against seven additional serotypes involved in cases of invasive pneumococcal disease (IPD) and pneumonia, which are responsible for up to 40% of all cases of pneumococcal disease and related deaths in the United States.

While the PREVNAR20 includes five more pneumococcal serotypes than PCV15, the

CDC does not recommend one over the other, Dr. Kobayashi noted.

More than 90% of cases of adult IPD involve older adults and adults with chronic medical conditions or immunocompromising conditions, cerebrospinal fluid leaks, or cochlear implants, the MMWR report notes.

Commenting on the recommendations, Amit A. Shah, MD, a geriatrician with the Mayo Clinic in Phoenix, Ariz., underscored the need for clinicians to be proactive in recommending the vaccines to those patients.

“Despite only needing one vaccine dose after turning 65 to be considered vaccinated, only about 70% of people in this group have received any pneumococcal vaccination,” he said in an interview. “This percentage has not increased much over the past several years.”

The new approach should help change that, he said.

“These new recommendations are a significant simplification from the prior confusing and challenging-to-implement recommendations from 2019,” Dr. Shah explained.

Among the 2019 recommendations was a stipulation for “shared decision-making” with PCV13, and a conversation that often only complicated matters, he noted.

“Patients and providers alike had confusion about this since it was not a clear-cut ‘yes, give it’ or ‘no, do not give it any longer’ recommendation.”

“Now that this new recommendation will require no extra time for a discussion in the clinic, and just a simple ‘it’s time for your pneumonia shot’ offer, this may become more feasible,” Dr. Shah added. “In addition, removal of the shared decision-making stipulation allows for this immunization to be easily protocolized in the clinic, similar to automatic offers to the flu vaccine for patients each year.”

According to the CDC, pneumococcal pneumonia causes an estimated 150,000 hospitalizations each year in the United States, while pneumococcal meningitis and bacteremia killed approximately 3,250 people in the United States in 2019.

“Clinicians are patients’ most trusted resource when it comes to vaccine recommendations,” Dr. Kobayashi said. “We encourage all clinicians to recommend pneumococcal vaccines when indicated.”

Dr. Kobayashi and Dr. Shah have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

 

Updated pneumococcal vaccine recommendations for adults from the Centers for Disease Control and Prevention call for the use of the two recently approved vaccines in a more streamlined approach to avoid the complexities of age and patient conditions that hindered previous recommendations.

The recommendations, voted on by the CDC’s Advisory Committee on Immunization Practices (ACIP) in October and made final in January with publication in the agency’s Morbidity and Mortality Weekly Report (MMWR), call for use of the 15-valent pneumococcal conjugate vaccine (PCV15; Vaxneuvance, Merck Sharp & Dohme) or 20-valent PCV (PREVNAR20; Wyeth Pharmaceuticals).

The recommendations apply to PCV-naive adults in the United States who are either aged 65 years or older, or who are aged 19-64 years and have underlying conditions such as diabetes, chronic heart or liver disease, or HIV, and have not previously received a PCV or whose previous vaccination history is unknown.

If the PCV15 vaccine is used, a subsequent dose of the 23-valent pneumococcal polysaccharide vaccine (PPSV23; Pneumovax23, Merck Sharp & Dohme) should be provided, typically at least 1 year later, under the recommendations.

As reported by this news organization, PCV15 and PREVNAR20 received approval from the Food and Drug Administration last July.

Those approvals provided an impetus for the revised recommendations, “offer[ing] an opportunity to review the existing recommendations and available data,” Miwako Kobayashi, MD, first author of the MMWR report and a medical epidemiologist with the National Center for Immunization and Respiratory Diseases, CDC, in Atlanta, said in an interview.

“As part of that process, ACIP strived to simplify the recommendations,” she said.

The previous recommendations called for the PCV13 vaccine and the PPSV23 and had varying conditions (depending on certain age and risk groups) that added complexity to the process. Under the new approach, the same recommendation applies regardless of specific medical conditions or other risk factors.

“With the simplified recommendation for adults 19 through 64, we expect coverage may increase among this population,” Dr. Kobayashi said.

Compared with PCV13, PREVNAR20 protects against seven additional serotypes involved in invasive pneumococcal disease (IPD) and pneumonia; these serotypes are responsible for up to 40% of all cases of pneumococcal disease and related deaths in the United States.

While PREVNAR20 includes five more pneumococcal serotypes than PCV15, the CDC does not recommend one over the other, Dr. Kobayashi noted.

More than 90% of cases of adult IPD involve older adults and adults with chronic medical conditions or immunocompromising conditions, cerebrospinal fluid leaks, or cochlear implants, the MMWR report notes.

Commenting on the recommendations, Amit A. Shah, MD, a geriatrician with the Mayo Clinic in Phoenix, Ariz., underscored the need for clinicians to be proactive in recommending the vaccines to those patients.

“Despite only needing one vaccine dose after turning 65 to be considered vaccinated, only about 70% of people in this group have received any pneumococcal vaccination,” he said in an interview. “This percentage has not increased much over the past several years.”

The new approach should help change that, he said.

“These new recommendations are a significant simplification from the prior confusing and challenging-to-implement recommendations from 2019,” Dr. Shah explained.

Among the 2019 recommendations was a stipulation for “shared decision-making” with PCV13, a conversation that often only complicated matters, he noted.

“Patients and providers alike had confusion about this since it was not a clear-cut ‘yes, give it’ or ‘no, do not give it any longer’ recommendation.”

“Now that this new recommendation will require no extra time for a discussion in the clinic, and just a simple ‘it’s time for your pneumonia shot’ offer, this may become more feasible,” Dr. Shah added. “In addition, removal of the shared decision-making stipulation allows for this immunization to be easily protocolized in the clinic, similar to automatic offers to the flu vaccine for patients each year.”

According to the CDC, pneumococcal pneumonia causes an estimated 150,000 hospitalizations each year in the United States, while pneumococcal meningitis and bacteremia killed approximately 3,250 people in the United States in 2019.

“Clinicians are patients’ most trusted resource when it comes to vaccine recommendations,” Dr. Kobayashi said. “We encourage all clinicians to recommend pneumococcal vaccines when indicated.”

Dr. Kobayashi and Dr. Shah have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


FROM THE MMWR


Vitamin D shows no survival benefit in nondeficient elderly

Article Type
Changed
Wed, 02/02/2022 - 14:33

Monthly supplementation with vitamin D3 (cholecalciferol) in older adults without deficiency has no significant benefit in terms of survival outcomes, including mortality linked to cardiovascular disease, new results from a large, placebo-controlled trial show.

“The take-home message is that routine vitamin D supplementation, irrespective of the dosing regimen, is unlikely to be beneficial in a population with a low prevalence of vitamin D deficiency,” first author Rachel E. Neale, PhD, of the Population Health Department, QIMR Berghofer Medical Research Institute, in Brisbane, Australia, told this news organization.

Zbynek Pospisil/Getty Images

Despite extensive previous research on vitamin D supplementation, “mortality has not been the primary outcome in any previous large trial of high-dose vitamin D supplementation,” Dr. Neale and coauthors noted. The results, published online in Lancet Diabetes & Endocrinology, are from the D-Health trial.

With more than 20,000 participants, this is the largest intermittent-dosing trial to date, the authors noted. The primary outcome was all-cause mortality.

In an accompanying editorial, Inez Schoenmakers, PhD, noted that “the findings [are] highly relevant for population policy, owing to the study’s population-based design, large scale, and long duration.”

This new “research contributes to the concept that improving vitamin D status with supplementation in a mostly vitamin D-replete older population does not influence all-cause mortality,” Dr. Schoenmakers, of the Faculty of Medicine and Health Sciences, University of East Anglia, Norwich, England, said in an interview.

“This is not dissimilar to research with many other nutrients showing that increasing intake above the adequate intake has no further health benefits,” she added.
 

D-Health Trial

The D-Health Trial involved 21,315 participants in Australia, enrolled between February 2014 and June 2015, who had not been screened for vitamin D deficiency but were largely considered to be vitamin D replete. Their mean age was 69.3 years, and 54% were men.

Participants were randomized 1:1 to a once-monthly oral vitamin D3 supplementation of 60,000 IU (n = 10,662) or a placebo capsule (n = 10,653).

Participants were permitted to take up to 2,000 IU/day of supplemental vitamin D in addition to the study protocol; those with a history of kidney stones, hypercalcemia, hyperparathyroidism, osteomalacia, or sarcoidosis were excluded.

Over a median follow-up of 5.7 years, there were 1,100 deaths: 562 in the vitamin D group (5.3%) and 538 in the placebo group (5.1%). With a hazard ratio (HR) for all-cause mortality of 1.04, the difference was not significant (P = .47).
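The crude death rates quoted above can be reproduced directly from the raw counts reported in the trial, as a quick arithmetic check:

```python
# Reproduce the crude death rates quoted above from the raw counts
# reported for the D-Health trial (arithmetic check only).
deaths_d3, n_d3 = 562, 10_662    # vitamin D group
deaths_pbo, n_pbo = 538, 10_653  # placebo group

rate_d3 = round(100 * deaths_d3 / n_d3, 1)     # 5.3%
rate_pbo = round(100 * deaths_pbo / n_pbo, 1)  # 5.1%

print(deaths_d3 + deaths_pbo, rate_d3, rate_pbo)  # 1100 total deaths
```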

There were also no significant differences in terms of mortality from cardiovascular disease (HR, 0.96; P = .77), cancer (HR, 1.15; P = .13), or other causes (HR, 0.83; P = .15).

Rates of total adverse events between the two groups, including hypercalcemia and kidney stones, were similar.

An exploratory analysis excluding the first 2 years of follow-up in fact showed a numerically higher hazard ratio for cancer mortality in the vitamin D group versus no supplementation (HR, 1.24; P = .05). However, the authors noted that the effect was “not apparent when the analysis was restricted to deaths that were coded by the study team and not officially coded.”

Nevertheless, “our findings, from a large study in an unscreened population, give pause to earlier reports that vitamin D supplements might reduce cancer mortality,” they underscored.

Retention and adherence in the study were high, each exceeding 80%. Although blood samples were not collected at baseline, samples from 3,943 randomly sampled participants during follow-up showed mean serum 25-hydroxy-vitamin D concentrations of 77 nmol/L in the placebo group and 115 nmol/L in the vitamin D group, both within the normal range of 50-125 nmol/L.
 

 

 

Findings supported by previous research

The trial results are consistent with those of prior large studies and meta-analyses of older adults with a low prevalence of vitamin D deficiency showing that vitamin D3 supplementation, regardless of whether taken daily or monthly, is not likely to have an effect on all-cause mortality.

In the US VITAL trial, recently published in the New England Journal of Medicine, among 25,871 participants administered 2,000 IU/day of vitamin D3 for a median of 5.3 years, there was no reduction in all-cause mortality.

The ViDA trial of 5,110 older adults in New Zealand, published in 2019 in the Journal of Endocrinological Investigation, also showed that monthly vitamin D3 supplementation of 100,000 IU for a median of 3.3 years was not associated with a benefit in people who were not deficient.

“In total, the results from the large trials and meta-analyses suggest that routine supplementation of older adults in populations with a low prevalence of vitamin D deficiency is unlikely to reduce the rate of all-cause mortality,” Dr. Neale and colleagues concluded.
 

Longer-term supplementation beneficial?

The population was limited to older adults and the study had a relatively short follow-up period, which Dr. Neale noted was necessary for pragmatic reasons.

“Our primary outcome was all-cause mortality, so to have sufficient deaths we either needed to study older adults or a much larger sample of younger adults,” she explained.

“However, we felt that [the former] ... had biological justification, as there is evidence that vitamin D plays a role later in the course of a number of diseases, with potential impacts on mortality.”

She noted that recent studies evaluating genetically predicted concentrations of serum 25(OH)D have further shown no link between those levels and all-cause mortality, stroke, or coronary heart disease.

“This confirms the statement that vitamin D is unlikely to be beneficial in people who are not vitamin D deficient, irrespective of whether supplementation occurs over the short or longer term,” Dr. Neale said.

The source of the vitamin D itself is another consideration, with ongoing speculation about whether benefits differ between dietary or supplement sources and sunlight exposure.

“Exposure to ultraviolet radiation, for which serum 25(OH)D concentration is a good marker, might confer benefits not mediated by vitamin D,” Dr. Neale and coauthors noted.

They added that the results in the older Australian population “cannot be generalized to populations with a higher prevalence of vitamin D deficiency, or with a greater proportion of people not of White ancestry, than the study population.”

Ten-year mortality rates from the D-Health trial are expected to be reported in the future.
 

Strategies still needed to address vitamin D deficiency

Further commenting on the findings, Dr. Schoenmakers underscored that “vitamin D deficiency is very common worldwide, [and] more should be done to develop strategies to address the needs of those groups and populations that are at risk of the consequences of vitamin D deficiency.”

That said, the D-Health study is important in helping to distinguish when supplementation may – and may not – be of benefit, she noted.

“This and other research in the past 15 years have contributed to our understanding [of] what the ranges of vitamin D status are [in which] health consequences may be anticipated.”

The D-Health Trial was funded by the National Health and Medical Research Council. Dr. Neale and Dr. Schoenmakers have reported no relevant financial relationships. 


A version of this article first appeared on Medscape.com.



FROM THE LANCET DIABETES & ENDOCRINOLOGY


Therapeutic drug monitoring with infliximab improves IBD disease control

Article Type
Changed
Wed, 01/26/2022 - 14:58

A proactive approach to therapeutic drug monitoring during maintenance therapy with infliximab in the treatment of inflammatory bowel diseases (IBDs) and other chronic immune-mediated inflammatory diseases significantly improves sustained disease control, compared with standard care, which was defined as no therapeutic drug monitoring, new research shows.

“This trial showed that therapeutic drug monitoring improved infliximab [maintenance] treatment by preventing disease flares without increasing drug consumption,” reported the authors of the study published in JAMA.

Infliximab and other tumor necrosis factor (TNF)–inhibitor drugs offer significant benefits in the treatment of IBDs and other chronic inflammatory diseases; however, up to 50% of patients become nonresponsive to the therapy within the first years of treatment, posing risks of disease worsening and possible organ damage.

Dr. Silje W. Syversen

To address the problem, therapeutic drug monitoring has been recommended, more prominently in IBD guidelines than in those for rheumatoid arthritis, with measures such as dose adjustment or drug switching when there is evidence that disease control is not being maintained.

However, with low serum drug concentrations and the development of antidrug antibodies believed to be key indicators of a waning response, there is growing interest in a more proactive monitoring approach, involving scheduled assessments of serum and antibody levels, performed regardless of any signs of disease activity changes, and the provision of dosing adjustments, if needed.

In the earlier Norwegian Drug Monitoring (NORDRUM) A trial, first author Silje Watterdal Syversen, MD, PhD, of the division of rheumatology and research, Diakonhjemmet Hospital, Oslo, Norway, and colleagues evaluated the effects of the proactive drug monitoring approach during the initiation phase of infliximab treatment, but found no significant improvement in the study of 411 patients in terms of the primary outcome of remission rates.

For the current NORDRUM B trial, they sought to instead determine if benefits of the proactive therapeutic drug monitoring may be more apparent during the maintenance phase of infliximab treatment.

The trial involved 458 adults at 20 hospitals in Norway who had immune-mediated inflammatory diseases (IMIDs), including rheumatoid arthritis (n = 80), spondyloarthritis (n = 138), psoriatic arthritis (n = 54), ulcerative colitis (n = 81), Crohn’s disease (n = 68), or psoriasis (n = 37), and who were undergoing maintenance therapy with infliximab.

The patients, who had a median of about 40 weeks’ prior infliximab therapy, were randomized 1:1 to receive either proactive therapeutic drug monitoring, consisting of scheduled monitoring of serum drug levels and antidrug antibodies and adjustments in dose and intervals as needed according to a trial algorithm (n = 228), or standard infliximab therapy, without the regular drug and antibody level monitoring (n = 230).

Over a 52-week follow-up, the primary outcome of sustained disease control without disease worsening was significantly higher in the proactive therapeutic drug monitoring group (73.6%; n = 167) versus standard care (55.9%; n = 127; P < .001). The risk of disease worsening was meanwhile significantly greater with standard care versus proactive drug monitoring (hazard ratio, 2.1).
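From the reported sustained-control rates, the absolute difference and a rough number needed to treat can be computed. This framing is ours, not the trial's; NORDRUM B reports proportions and a hazard ratio, not an NNT.

```python
# Absolute risk difference and number needed to treat, computed from
# the reported NORDRUM B sustained-control rates (illustrative only).
p_tdm = 0.736  # sustained disease control, proactive TDM arm
p_std = 0.559  # sustained disease control, standard care arm

arr = p_tdm - p_std  # absolute difference: ~0.177
nnt = 1 / arr        # ~5.6: roughly 6 patients monitored proactively
                     # per additional patient kept in sustained
                     # disease control over 52 weeks
print(round(arr, 3), round(nnt, 1))
```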

Serum infliximab levels remained within the therapeutic range throughout the study period in 30% of patients receiving proactive therapeutic drug monitoring, compared with 17% in the standard care group, and low serum infliximab levels (≤2 mg/L) occurred at least once during the study period among 19% and 27% of the two groups, respectively.

Clinically significant levels of antidrug antibodies (≥50 mcg/L) occurred in 9.2% of the therapeutic drug monitoring patients and 15.0% of the standard care group.
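The two thresholds quoted above (low serum infliximab, ≤2 mg/L; clinically significant antidrug antibodies, ≥50 mcg/L) can be expressed as a toy sample classifier. This is only a sketch of the flagging criteria named in this article; the NORDRUM B dosing algorithm itself is more detailed and is not reproduced here.

```python
# Toy classifier using only the two thresholds quoted above.
# Not the NORDRUM B trial algorithm, and not clinical guidance.
def flag_sample(serum_ifx_mg_per_l, ada_mcg_per_l):
    """Flag a serum sample against the article's two thresholds."""
    flags = []
    if serum_ifx_mg_per_l <= 2:     # low drug level (<=2 mg/L)
        flags.append("low drug level")
    if ada_mcg_per_l >= 50:         # significant antibodies (>=50 mcg/L)
        flags.append("significant antidrug antibodies")
    return flags

print(flag_sample(1.5, 80))  # both flags raised
print(flag_sample(6.0, 10))  # -> []
```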

About 55% of patients were also using concomitant immunosuppressive therapy, and the findings were consistent among those who did and did not use the drugs. There were no significant differences in adverse events between the therapeutic drug monitoring group (60%) and the standard care group (63%).

Importantly, while the mean dose of infliximab during the trial was 4.8 mg/kg in both groups, an increase in dosage after disease worsening was more common in the standard therapy group (51%) than in the therapeutic drug monitoring group (31.6%), underscoring the improved dose control provided with the proactive therapeutic drug monitoring, the authors note.

The findings suggest “proactive therapeutic drug monitoring might be more important during maintenance therapy, a period during which low drug levels could be an important risk factor for therapeutic failure,” the authors conclude.

In comments to this news organization, Dr. Syversen noted that, despite the variety of IMIDs included, outcomes did not differ significantly between patients with IBDs and those with other disorders.

“Our data show consistent findings across all diseases included,” she said.

Furthermore, “in the present study, and [prior research] using the same definition of disease worsening, IBD patients in general had a comparable flare rate as compared to inflammatory arthritis.”
 

 

 

Caveats include severe illness, potential cost challenges

Commenting on the research, Stephen B. Hanauer, MD, noted that a potential exception to the lack of benefit previously observed during treatment induction could be patients with severe illness.

“In patients with more severe disease and in particular with low albumin, [which is] more common in IBD than other immune-mediated inflammatory diseases, there is rapid metabolism and clearance of monoclonal antibodies early that limit efficacy of standard induction dosing,” said Dr. Hanauer, a professor of medicine and medical director of the Digestive Health Center at Northwestern University, Chicago. 

Northwestern University
Dr. Stephen B. Hanauer

Noting that the average duration of treatment in the study prior to randomization was nearly a year (about 40 weeks), he added that a key question is “How to initiate therapeutic drug monitoring earlier in course to further optimize induction and prevent loss of response in maintenance.”

Dr. Hanauer shared that, “in our practice, we use a combination of proactive and reactive therapeutic drug monitoring based on individual patients and their history with prior biologics.”
 

Findings may usher in ‘new era’ in immune-mediated inflammatory disease treatment

In an editorial published in JAMA with the study, Zachary S. Wallace, MD, and Jeffrey A. Sparks, MD, both of Harvard Medical School in Boston, further commented that “maintaining disease control in nearly 3 of 4 patients represents a meaningful improvement over standard care.”

However, key challenges with the approach include the potential need for additional nurses and others to help monitor patients, and associated costs, which insurance providers may not always cover, they noted.

Another consideration is a lack of effective tools for monitoring measures including antidrug antibodies, they added, and “additional clinical trials within specific disease subgroups are needed.”

However, addressing such barriers “may help introduce a new era in treatment approach to maintenance therapy for patients with immune-mediated inflammatory diseases,” the editorialists write.

Niels Vande Casteele, PharmD, PhD, an associate professor at the department of medicine, University of California, San Diego, and affiliate faculty, Skaggs School of Pharmacy & Pharmaceutical Sciences, commented that the study is “an important milestone in the field of therapeutic drug monitoring of biologics for immunoinflammatory diseases.”

Dr. Niels Vande Casteele

“The ability of proactive therapeutic drug monitoring to achieve sustained disease control without increased drug consumption is [a] notable finding,” he said in an interview.

Noting a limitation, Dr. Casteele suggested the inclusion of more specific measures of disease activity could have provided clearer insights.

“In particular for gastrointestinal diseases, we know that symptoms do not correlate well with inflammatory disease activity,” he said. “As such, I would have preferred to see clinical symptoms being complemented with endoscopic and histologic finds to confirm disease activity.”

Ultimately, he said that the results suggest “proactive therapeutic drug monitoring is not required for all patients, but it is beneficial in some to achieve sustained disease control over a prolonged period of time.”

This study received funding by grants from the Norwegian Regional Health Authorities (interregional KLINBEFORSK grants) and the South-Eastern Norway Regional Health Authorities; study authors reported relationships with various pharmaceutical companies, including Pfizer, which makes infliximab. Dr. Wallace reported research support from Bristol Myers Squibb and Principia/Sanofi and consulting fees from Viela Bio, Zenas Biopharma, and MedPace. Dr. Sparks reported consultancy fees from AbbVie, Bristol Myers Squibb, Gilead, Inova Diagnostics, Janssen, Optum, and Pfizer. Dr. Hanauer is a consultant and lecturer for Abbvie, Janssen, and Takeda and consultant for Pfizer, Celltrion, Amgen, Samsung Bioepis. Dr. Casteele has received research grants and personal fees from R-Biopharm, Takeda and UCB; and personal fees from AcelaBio, Alimentiv, Celltrion, Prometheus, Procise DX, and Vividion for activities that were all outside of the reviewed study.

Help your patients better understand their IBD treatment options by sharing AGA’s patient education, “Living with IBD,” in the AGA GI Patient Center at www.gastro.org/IBD. 

Publications
Topics
Sections

A proactive approach to therapeutic drug monitoring during maintenance therapy with infliximab in the treatment of inflammatory bowel diseases (IBDs) and other chronic immune-mediated inflammatory diseases significantly improves sustained disease control, compared with standard care, which was defined as no therapeutic drug monitoring, new research shows.

“This trial showed that therapeutic drug monitoring improved infliximab [maintenance] treatment by preventing disease flares without increasing drug consumption,” reported the authors of the study published in JAMA.

Infliximab and other tumor necrosis factor (TNF)–inhibitor drugs offer significant benefits in the treatment of IBDs and other chronic inflammatory diseases; however, up to 50% of patients become nonresponsive to the therapy within the first years of treatment, posing risks of disease worsening and possible organ damage.


To address the problem, therapeutic drug monitoring has been recommended, more strongly in IBD guidelines than in those for rheumatoid arthritis, with measures such as dose adjustment or switching drugs when there is evidence that disease control is not being maintained.

Because low serum drug concentrations and the development of antidrug antibodies are believed to be key indicators of a waning response, there is growing interest in a more proactive monitoring approach: scheduled assessments of serum drug and antibody levels, performed regardless of any change in disease activity, with dosing adjustments as needed.

In the earlier Norwegian Drug Monitoring (NORDRUM) A trial, first author Silje Watterdal Syversen, MD, PhD, of the division of rheumatology and research, Diakonhjemmet Hospital, Oslo, and colleagues evaluated proactive drug monitoring during the initiation phase of infliximab treatment in 411 patients but found no significant improvement in the primary outcome of remission rates.

For the current NORDRUM B trial, they sought to determine whether the benefits of proactive therapeutic drug monitoring might be more apparent during the maintenance phase of infliximab treatment.

The trial involved 458 adults at 20 hospitals in Norway who had immune-mediated inflammatory diseases (IMIDs), including rheumatoid arthritis (n = 80), spondyloarthritis (n = 138), psoriatic arthritis (n = 54), ulcerative colitis (n = 81), Crohn’s disease (n = 68), or psoriasis (n = 37), and who were undergoing maintenance therapy with infliximab.

The patients, who had a median of about 40 weeks’ prior infliximab therapy, were randomized 1:1 to receive either proactive therapeutic drug monitoring, consisting of scheduled monitoring of serum drug levels and antidrug antibodies and adjustments in dose and intervals as needed according to a trial algorithm (n = 228), or standard infliximab therapy, without the regular drug and antibody level monitoring (n = 230).

Over a 52-week follow-up, the primary outcome of sustained disease control without disease worsening was significantly higher in the proactive therapeutic drug monitoring group (73.6%; n = 167) versus standard care (55.9%; n = 127; P < .001). The risk of disease worsening was meanwhile significantly greater with standard care versus proactive drug monitoring (hazard ratio, 2.1).

Serum infliximab levels remained within the therapeutic range throughout the study period in 30% of patients receiving proactive therapeutic drug monitoring, compared with 17% in the standard care group, and low serum infliximab levels (≤2 mg/L) occurred at least once during the study period among 19% and 27% of the two groups, respectively.

Clinically significant levels of antidrug antibodies (≥50 mcg/L) occurred in 9.2% of the therapeutic drug monitoring patients and 15.0% of the standard care group.

About 55% of patients were also using concomitant immunosuppressive therapy, and the findings were consistent among those who did and did not use the drugs. There were no significant differences in adverse events between the therapeutic drug monitoring (60%) and standard care (63%) groups.

Importantly, while the mean dose of infliximab during the trial was 4.8 mg/kg in both groups, an increase in dosage after disease worsening was more common in the standard therapy group (51%) than in the therapeutic drug monitoring group (31.6%), underscoring the improved dose control provided by proactive therapeutic drug monitoring, the authors note.

The findings suggest “proactive therapeutic drug monitoring might be more important during maintenance therapy, a period during which low drug levels could be an important risk factor for therapeutic failure,” the authors conclude.

In comments to this news organization, Dr. Syversen noted that, despite the variety of IMIDs studied, there were no significant differences in outcomes between patients with IBDs and those with other disorders.

“Our data show consistent findings across all diseases included,” she said.

Furthermore, “in the present study, and [prior research] using the same definition of disease worsening, IBD patients in general had a comparable flare rate as compared to inflammatory arthritis.”

Caveats include severe illness, potential cost challenges

Commenting on the research, Stephen B. Hanauer, MD, noted that a potential exception to the lack of benefit previously observed during treatment induction could be patients with severe illness.

“In patients with more severe disease and in particular with low albumin, [which is] more common in IBD than other immune-mediated inflammatory diseases, there is rapid metabolism and clearance of monoclonal antibodies early that limit efficacy of standard induction dosing,” said Dr. Hanauer, a professor of medicine and medical director of the Digestive Health Center at Northwestern University, Chicago. 


Noting that the median duration of infliximab treatment in the study prior to randomization was about 40 weeks, he added that a key question is “How to initiate therapeutic drug monitoring earlier in course to further optimize induction and prevent loss of response in maintenance.”

Dr. Hanauer shared that, “in our practice, we use a combination of proactive and reactive therapeutic drug monitoring based on individual patients and their history with prior biologics.”
 

Findings may usher in ‘new era’ in immune-mediated inflammatory disease treatment

In an editorial published in JAMA with the study, Zachary S. Wallace, MD, and Jeffrey A. Sparks, MD, both of Harvard Medical School in Boston, further commented that “maintaining disease control in nearly 3 of 4 patients represents a meaningful improvement over standard care.”

However, key challenges with the approach include the potential need for additional nurses and others to help monitor patients, and associated costs, which insurance providers may not always cover, they noted.

Another consideration is a lack of effective tools for monitoring measures including antidrug antibodies, they added, and “additional clinical trials within specific disease subgroups are needed.”

However, addressing such barriers “may help introduce a new era in treatment approach to maintenance therapy for patients with immune-mediated inflammatory diseases,” the editorialists write.

Niels Vande Casteele, PharmD, PhD, an associate professor in the department of medicine, University of California, San Diego, and affiliate faculty at the Skaggs School of Pharmacy & Pharmaceutical Sciences, commented that the study is “an important milestone in the field of therapeutic drug monitoring of biologics for immunoinflammatory diseases.”


“The ability of proactive therapeutic drug monitoring to achieve sustained disease control without increased drug consumption is [a] notable finding,” he said in an interview.

Noting a limitation, Dr. Vande Casteele suggested that the inclusion of more specific measures of disease activity could have provided clearer insights.

“In particular for gastrointestinal diseases, we know that symptoms do not correlate well with inflammatory disease activity,” he said. “As such, I would have preferred to see clinical symptoms being complemented with endoscopic and histologic findings to confirm disease activity.”

Ultimately, he said that the results suggest “proactive therapeutic drug monitoring is not required for all patients, but it is beneficial in some to achieve sustained disease control over a prolonged period of time.”

The study was funded by grants from the Norwegian Regional Health Authorities (interregional KLINBEFORSK grants) and the South-Eastern Norway Regional Health Authorities; study authors reported relationships with various pharmaceutical companies, including Pfizer, which makes infliximab. Dr. Wallace reported research support from Bristol Myers Squibb and Principia/Sanofi and consulting fees from Viela Bio, Zenas Biopharma, and MedPace. Dr. Sparks reported consultancy fees from AbbVie, Bristol Myers Squibb, Gilead, Inova Diagnostics, Janssen, Optum, and Pfizer. Dr. Hanauer is a consultant and lecturer for AbbVie, Janssen, and Takeda and a consultant for Pfizer, Celltrion, Amgen, and Samsung Bioepis. Dr. Vande Casteele has received research grants and personal fees from R-Biopharm, Takeda, and UCB, and personal fees from AcelaBio, Alimentiv, Celltrion, Prometheus, Procise DX, and Vividion, all for activities outside of the reviewed study.

Help your patients better understand their IBD treatment options by sharing AGA’s patient education, “Living with IBD,” in the AGA GI Patient Center at www.gastro.org/IBD. 


Article Source

FROM JAMA


Could probiotics reduce ‘chemo brain’ in breast cancer patients?

Article Type
Changed
Wed, 01/04/2023 - 17:16

Patients with breast cancer treated with chemotherapy who also took a probiotics supplement had significantly fewer symptoms of chemotherapy-related cognitive impairment (CRCI), often referred to as “chemo brain,” compared with a control group taking placebo capsules, according to the first study of its kind.

“Our finding[s] provide a simple, inexpensive, and effective prevention strategy for chemotherapy-related side effects, including cognitive impairment,” senior author Jianbin Tong, MD, PhD, of the department of anesthesiology, Third Xiangya Hospital, Central South University, Changsha, Hunan, China, said in an interview.

The research “is the first study showing that probiotics supplementation during chemotherapy can prevent chemotherapy-related brain impairment,” he noted.

The double-blind, randomized study was published in the European Journal of Cancer. It involved 159 patients in China with stage I-III breast cancer who required adjuvant chemotherapy between 2018 and 2019. These patients were randomized to receive a regimen of three capsules twice per day containing either probiotics (n = 80) or placebo (n = 79) during their chemotherapy.

The probiotic capsule (Bifico, Sine Pharmaceuticals) contained Bifidobacterium longum, Lactobacillus acidophilus, and Enterococcus faecalis (210 mg of each).

The reductions in symptoms seen with the supplementation “exceed our expectations,” Dr. Tong said in an interview.

He speculated that this may have longer-term effects, with the prevention of initial cognitive impairment potentially “changing the neurodegenerative trajectory of patients after chemotherapy.”

“Patients don’t need to take probiotics continuously, but it’s better to take probiotics intermittently,” he said.

Approached for comment, Melanie Sekeres, PhD, Canada Research Chair and assistant professor at the University of Ottawa, said the improvements, such as those seen in delayed recall, are especially of interest.

“This is particularly notable because one of the brain regions that is critically involved in long-term memory processing, the hippocampus, is known to be highly sensitive to chemotherapy-induced neurotoxicity,” she said in an interview.

“The finding that probiotic treatment given alongside chemotherapy is sufficient to, in part, protect against memory disturbances in these patients suggests that there may be some neuroprotection conferred by the probiotic treatment,” she said.

A key question is whether similar results would be seen with other chemotherapy regimens, Dr. Sekeres added. “To better understand the effectiveness of these probiotics in preventing CRCI, they should be tested using other classes of chemotherapies before any broad conclusions can be made.”
 

Measuring the effect on ‘chemo brain’

“Chemo brain” is commonly reported after chemotherapy, and some 35% of patients report long-term effects. Key symptoms include deficits in memory, attention, executive function, and processing speed.

In their study, Dr. Tong and colleagues assessed patients’ cognitive status with a battery of validated neuropsychological tests 1 day prior to initiating chemotherapy and 21 days after the last cycle of chemotherapy. Tests included the Hopkins Verbal Learning Test–Revised for verbal memory, the Brief Visuospatial Memory Test–Revised for visuospatial memory, and various others.

The team reports that, after adjustment for confounding factors, the total incidence of CRCI was significantly lower in the probiotics group versus the placebo group 21 days post chemotherapy (35% vs. 81%; relative risk, 0.43).

Rates of mild cognitive impairment were also lower in the probiotics group (29% vs. 52%; RR, 0.55), as were rates of moderate cognitive impairment (6% vs. 29%; RR, 0.22).

The improvements with probiotics were observed across most other neuropsychological domains, including instantaneous verbal memory and delayed visuospatial memory (for both, P = .003) and visuospatial interference and verbal fluency (for both, P < .001).

The greater improvements in the probiotics group were seen regardless of use of other medications or the type of chemotherapy regimen received, which could have included epirubicin or docetaxel and/or cyclophosphamide.

CRCI was more common in patients who were older and had lower education or a higher body mass index; however, the improvements in the probiotics group were observed regardless of those factors, the authors commented.

In addition to the reduction in cognitive impairment that was seen, the treatment with probiotics was also associated with lower blood glucose (mean, 4.96 vs. 5.30; P = .02) and lower LDL cholesterol (2.61 vs. 2.89; P = .03) versus placebo, while there were no significant differences between the groups prior to chemotherapy.

There were no reports of severe (grade 3 or higher) emesis or constipation in either group; however, the overall incidence of both side effects was significantly lower in the probiotics group, the authors note.

How does it work?

The potential benefits with probiotics are theorized to result from stabilizing the colonic and bacterial disruptions that are caused by chemotherapy, potentially offsetting the neuroinflammation that is linked to the cancer treatment, the authors speculated.

A subanalysis of 78 stool samples from 20 patients in the study showed no differences in alpha diversity or beta diversity before or after chemotherapy; however, there were significant reductions in the abundance of Streptococcus and Tyzzerella (P = .023 and P = .033, respectively) in the probiotics group after chemotherapy.

Further analysis showed that the probiotics supplement modulated the levels of nine plasma metabolites in patients with breast cancer, with the results suggesting that metabolites (including p-mentha-1,8-dien-7-ol) “may be modulators in preventing CRCI by probiotics,” the authors noted.
 

Benefits reported beyond breast cancer

A subsequent trial by Dr. Tong and colleagues showed similar protective benefits of probiotics in the prevention of chemotherapy-related hand-foot syndrome and oral mucositis.

And in a recent study, the research team found evidence that probiotic supplements protect against cognitive impairment in elderly patients following surgery.

The study received support from the National Natural Science Foundation of China, a subproject of the National Key Research and Development Program of China, the Science and Technology Innovation Platform and Talent Plan of Hunan Province, and the Natural Science Foundation of Hunan Province.

A version of this article first appeared on Medscape.com.


FROM THE EUROPEAN JOURNAL OF CANCER


Swallowable intragastric balloon shows significant weight loss

Article Type
Changed
Thu, 01/13/2022 - 15:27

The Allurion intragastric balloon (formerly the Elipse, Allurion Technologies), a novel balloon that is swallowed rather than placed surgically or endoscopically, shows high efficacy in achieving weight loss and improving the metabolic profile, with fewer adverse events than reported for other available gastric balloons, results from a meta-analysis show.

“We believe this analysis to be the most comprehensive review [of the Allurion balloon],” reported first author Daryl Ramai, MD, of the division of gastroenterology and hepatology, University of Utah, Salt Lake City, and colleagues in the research, published in the November/December 2021 issue of the Journal of Clinical Gastroenterology.

“Our study showed that the Allurion balloon reduces waist circumference and triglyceride levels and [is] associated with less adverse events when compared with other intragastric balloons,” the authors concluded.

Unlike other balloons, the Allurion gastric balloon is compressed into a small capsule connected to a thin catheter. Once swallowed, the balloon is inflated with 550 mL of liquid through the catheter to create a feeling of fullness and help control hunger.

The procedure can be performed on an outpatient basis in approximately 20 minutes, potentially avoiding the burden and extra costs of surgery or endoscopic placement and removal. After approximately 4 months, the balloon is designed to empty through a valve that spontaneously opens, and the balloon is then passed in the stool.

Though currently used around the world, the balloon does not yet have approval from the Food and Drug Administration.
 

Meta-analysis shows 12.2% average weight loss across studies

To assess the balloon’s performance, the authors identified 7 of 273 published studies that met the analysis criteria. The studies included 2,152 patients, ranging in age from 18 to 65 years, with mean baseline body mass indexes of 32.1-38.6 kg/m².

All of the studies were prospective, with reported outcomes at 3-4 months, when the Allurion balloon typically deflates. Three of the studies were multicenter, while four were single center.

For BMI, the pooled mean difference from baseline to the end of the studies was 0.88 (P = .001), and the weighted average percentage of total body weight loss during treatment across the studies was 12.2%.

The mean excess body weight loss across the Allurion studies was 49.1%.
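The two weight-loss measures quoted above are simple ratios. The sketch below shows the arithmetic with hypothetical patient values (these numbers are illustrative only, not data from the meta-analysis):

```python
# Illustrative arithmetic for the two standard bariatric weight-loss metrics.
# All patient values below are hypothetical.

def pct_total_body_weight_loss(baseline_kg: float, final_kg: float) -> float:
    """%TBWL = weight lost / baseline weight x 100."""
    return (baseline_kg - final_kg) / baseline_kg * 100

def pct_excess_weight_loss(baseline_kg: float, final_kg: float,
                           ideal_kg: float) -> float:
    """%EWL = weight lost / (baseline weight - ideal weight) x 100."""
    return (baseline_kg - final_kg) / (baseline_kg - ideal_kg) * 100

# Hypothetical patient: 100 kg at baseline, 88 kg after treatment,
# ideal body weight 75 kg.
tbwl = pct_total_body_weight_loss(100, 88)   # 12.0% (pooled estimate: 12.2%)
ewl = pct_excess_weight_loss(100, 88, 75)    # 48.0% (pooled estimate: 49.1%)
print(f"%TBWL = {tbwl:.1f}, %EWL = {ewl:.1f}")
```

Because %EWL divides by the smaller excess-weight denominator, it is always the larger of the two numbers for the same patient, which is why the 49.1% and 12.2% figures describe the same pooled weight loss.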

The analysis was not designed to directly compare outcomes with other balloons, but the authors note, for instance, that the ReShape Duo intragastric balloon (an FDA-approved dual-balloon system) has been reported in a previous study to be associated with a percentage of total body weight loss of 7.6% at 6 months, compared with 3.6% observed among those with lifestyle modifications.

However, a separate meta-analysis showed the pooled percentage of total body weight loss with the FDA-approved Orbera balloon to be about the same as the current Allurion analysis, at 12.3% at 3 months after implantation (followed by 13.2% at 6 months and 11.3% at 12 months). The analysis further showed excess body weight loss with the Orbera balloon at 12 months to be 25.4%.

In other outcomes, the meta-analysis also showed significant improvements with the Allurion balloon versus baseline in waist circumference (0.89; P = .001) and triglyceride levels (0.66; P = .004).

Previous research involving the FDA-approved Obalon intragastric balloon, which is inflated with gas rather than liquid, showed a significant reduction in waist circumference from 109 cm (±12.3) to 99 cm (±10.5) (P < .05), and another study showed that 37.5% of patients receiving the Orbera balloon had normalized triglyceride levels after 4 months, without concomitant medical therapy.
 

 

 

Adverse events appear lower vs. other balloons

Risks associated with the Allurion balloon include early deflation; however, the pooled rate of early balloon deflation in the meta-analysis was relatively low, at 1.8%.

Other adverse events reported with the Allurion balloon were abdominal pain (37.5%), vomiting (29.6%), diarrhea (15.4%), and small bowel obstruction (0.5%).

The corresponding rates of abdominal pain with the ReShape Duo and Orbera balloons have been reported at 54.5% and 57.5%, respectively, with the effects possibly caused by overinflation, the authors noted.

And rates of vomiting with the ReShape Duo and Orbera balloons have been reported as much higher, at 86.7% and 86.8%, respectively.

Of note, there were no deaths or cases of acute pancreatitis reported in the meta-analysis studies of Allurion.

As reported by this news organization, such concerns have been raised in previous FDA alerts regarding the Orbera and ReShape Duo liquid-filled intragastric balloons.

In the most recent update, issued in April 2020, the FDA described receiving reports of 18 deaths that had occurred worldwide since the approvals of the Orbera and ReShape balloons, including eight in the United States.

Dr. Ramai acknowledged that such concerns are warranted.

“These concerns are valid,” he told this news organization. “Theoretically, since the Allurion balloon is placed for a shorter time span, it is conceivable that there may be less adverse events. However, comparative trials are needed to confirm this.”

Although the balloons show efficacy in patients struggling with weight loss, metabolic syndrome, and fatty liver disease, “the type and duration of intragastric balloons should be tailored to the patient,” Dr. Ramai said.

“Clinicians should thoroughly discuss with their patients the benefits and risks of using an intragastric balloon,” he added. “Furthermore, placement of intragastric balloons should only be attempted by clinicians with expertise in bariatric endoscopy.”

The study received no financial support. Dr. Ramai reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.



FROM THE JOURNAL OF CLINICAL GASTROENTEROLOGY


HIV+ patients get good outcomes after kidney or liver transplant

Article Type
Changed
Thu, 01/06/2022 - 13:38

Liver or kidney transplant recipients who are HIV-positive show outcomes similar to those of recipients without HIV at 15 years post-transplant, in new research that represents some of the longest follow-up on these patients to date.

The findings further support the inclusion of people with HIV in transplant resource allocation, say the researchers.

“Overall, the excellent outcomes following liver and kidney transplant[ation] in HIV-infected recipients justify the utilization of a scarce resource,” senior author Peter G. Stock, MD, PhD, surgical director of the Kidney and Pancreas Transplant Program and surgical director of the Pediatric Renal Transplant Program at the University of California, San Francisco (UCSF), said in an interview.

“Many centers still view HIV as a strict contraindication [for transplantation]. This data shows it is not,” he emphasized.

The study, published in JAMA Surgery, involved HIV-positive patients who received kidney or liver transplants between 2000 and 2019 at UCSF, which has unique access to some of the longest-term data on those outcomes.

“UCSF was the first U.S. center to do transplants routinely in people with HIV, and based on the large volume of transplants that are performed, we were able to use propensity matching to address the comparison of HIV-positive and negative liver and kidney transplant recipients at a single center,” Dr. Stock explained.

“To the best of our knowledge, there are no long-term reports [greater than 10 years] on [transplant] outcomes in the HIV-positive population.”

Commenting on the study, David Klassen, MD, chief medical officer of the United Network for Organ Sharing (UNOS), noted that the findings “confirm previous research done at UCSF and reported in the New England Journal of Medicine” in 2010. “It extends the previous findings.”

“The take-home message is that these HIV-positive patients can be successfully transplanted with expected good outcomes and will derive substantial benefit from transplantation,” Dr. Klassen said.
 

Kidney transplant patient survival lower, graft survival similar

For the kidney transplant analysis, 119 HIV-positive recipients were propensity matched with 655 recipients who were HIV-negative, with the patients’ mean age about 52 and approximately 70% male.

At 15 years post-transplant, patient survival was 53.6% among the HIV-positive patients versus 79.6% among the HIV-negative patients (P = .03).

Graft survival among the kidney transplant patients was numerically higher among HIV-positive patients at 15 years (75% vs. 57%); however, the difference was not statistically significant (P = .77).

First author Arya Zarinsefat, MD, of the Department of Surgery at UCSF, speculated that the lower long-term patient survival among HIV-positive kidney transplant recipients may reflect known cardiovascular risks among those patients.

“We postulated that part of this may be due to the fact that HIV-positive patients certainly have additional comorbidities, specifically cardiovascular” ones, he told this news organization.

“When looking at the survival curve, survival was nearly identical at 5 years and only started to diverge at 10 years post-transplant,” he noted.

A further evaluation of patients with HIV who were co-infected with hepatitis C (HCV) showed that those with HIV-HCV co-infection prior to the center’s introduction of anti-HCV direct-acting antiviral (DAA) medications in 2014 had the lowest survival rate of all subgroups, at 57.1% at 5 years post-transplant (P = .045 vs. those treated after 2014).
 

 

 

Liver transplant patient survival similar

In terms of liver transplant outcomes, among 83 HIV-positive recipients who were propensity-matched with 468 HIV-negative recipients, the mean age was about 53 and about 66% were male.

The patient survival rates at 15 years were not significantly different between the groups, at 70% for HIV-positive and 75.7% for HIV-negative recipients (P = .12).

Similar to the kidney transplant recipients, the worst survival among all liver transplant subgroups was among HIV-HCV co-infected patients prior to access to HCV direct-acting antivirals in 2014, with a 5-year survival of 59.5% (P = .04).

“Since the advent of HCV direct-acting antivirals, liver transplant outcomes in HCV mono-infected patients are comparable to HCV/HIV co-infected recipients,” Dr. Stock said.
 

Acute rejection rates higher with HIV-positivity versus national averages

The rates of acute rejection at 1 year in the HIV-positive kidney and liver transplant groups – about 20% and 30%, respectively – were, however, higher than the national average incidence of about 10% at 1 year.

Long-term data showed that acute rejection affected graft survival outcomes in kidney transplant recipients: HIV-positive kidney transplant recipients who had at least one episode of acute rejection had graft survival of just 52.8% at 15 years post-transplant, compared with 91.8% among recipients without acute rejection.

Such differences were not observed among HIV-positive liver transplant recipients.

The authors note that the increased risk of acute rejection in HIV-positive kidney transplant patients is consistent with previous studies, with causes that may be multifactorial.

Top theories include drug interactions with protease inhibitors, which have led some centers to transition HIV-infected patients from protease inhibitor–based regimens to integrase-based regimens prior to transplant.

“The management and prevention of acute rejection in HIV-positive kidney transplant [patients] will therefore continue to be a key component in the care of these patients,” the authors note in their study.

The study was supported in part by the National Institutes of Health. The study authors and Dr. Klassen have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Peanut desensitization plummets 1 month after avoiding exposure

Article Type
Changed
Tue, 01/04/2022 - 14:53

Children with peanut allergies treated with peanut oral immunotherapy for 3 years can tolerate increasingly higher exposures to peanuts. But avoidance of peanut-protein exposure for just a single month after the treatment leads to rapid and substantial decreases in tolerance, findings from a small study show.

The findings “underscore the fact that the desensitization achieved with peanut oral immunotherapy is a transient immune state,” report the authors of the study, published in December in The Journal of Allergy and Clinical Immunology: In Practice.

Therefore, “adherence to dosing [in peanut immunotherapy] is very important, and clinicians should expect a decline in tolerance with lapse in dosing,” first author Carla M. Davis, MD, director of the Texas Children’s Hospital Food Allergy Program at Baylor College of Medicine, Houston, told this news organization.

Oral immunotherapy, involving small exposures to peanut protein to build up desensitization, has been shown to mitigate allergic reactions, and, as reported by this news organization, the first peanut oral immunotherapy drug recently received approval from the U.S. Food and Drug Administration.

However, current approaches involve very low daily exposure of about 300 mg of peanut protein, equivalent to only about one to two peanuts, and research is lacking regarding the maximum tolerated doses, as well as on how long the tolerance is sustained if maintenance therapy is discontinued. “For the peanut-allergic population that would like to eat more than 1-2 peanuts, an achievable dose is currently unknown,” the study authors write. “The critical question, of the maximum tolerated dose achieved after POIT, has not been answered.”

To evaluate those issues in their phase 2 study, Dr. Davis and her colleagues enrolled 28 subjects between the ages of 5 and 13 with a diagnosis of eosinophilic esophagitis and peanut allergy.

The treatment protocol included a 1-year buildup phase of oral immunotherapy, followed by a 2-year daily maintenance phase with a dose of 3,900 mg of peanut protein.

After consenting, 11 patients dropped out of the study owing to a lack of interest, and two more withdrew after failing to tolerate their first dose, leaving 15 patients who started treatment (mean age, 8.7 years; range, 5.2-12.5 years; 47% female).

Twelve patients reached the maintenance dose of 3,900 mg over a median of 13 months, and double-blind, placebo-controlled peanut challenges showed that their mean maximum cumulative tolerated dose after 12 months increased by 12,063 mg (P < .001), while the mean dose triggering a reaction increased by 15,667 mg.

Of the 12 patients, 11 (91.7%) were able to successfully tolerate at least 10,725 mg after 12 months of treatment, and six patients (50.0%) successfully tolerated at least 15,225 mg.

Two patients were able to tolerate up to the maximum cumulative target dose of 26,225 mg, equivalent to more than 105 peanuts.

“The ability to tolerate [greater than] 100 peanuts following peanut oral immunotherapy has never before been demonstrated and gives insight into the potential for food oral immunotherapy to be utilized in a subset of patients who have an immunologic phenotype accepting of this therapy,” the authors write.

“Understanding the risk of ingestion of peanut protein higher than the prescribed peanut oral immunotherapy maintenance dose will improve the safe, practical use of [the therapy],” they add.
 

 

 

Tolerance plummets with avoidance

In the protocol’s third phase, after the 3-year buildup and maintenance period, daily peanut exposure was avoided for 30 days. Among the six patients who participated, the mean maximum cumulative tolerated dose declined to just 2,783 mg, and the reaction dose dropped to 4,614 mg (P = .03).

“This was a disappointing finding, because we thought the desensitization would last longer after such a long period of treatment,” Dr. Davis said.

While the avoidance period was only a month, Dr. Davis said she expects the rebound in sensitivity would continue if avoidance were prolonged. “Other studies indicate the decline in tolerance would continue over time, [and] we believe it would continue to decline,” she said.

Further analysis of peanut allergy biomarkers showed significant decreases in skin prick test wheal size and cytokine expression within the first 6 weeks of initiation of the peanut oral immunotherapy. The patterns were reversed during the 1-month avoidance, with both measures increasing.

Of note, the changes in biomarkers varied significantly among the participants.

In terms of adverse events, eight patients (53%) required one or two doses of epinephrine during the study, with all but two patients receiving the epinephrine during the 12-month buildup phase, consistent with previous studies.

In commenting on the study, Richard L. Wasserman, MD, PhD, medical director of pediatric allergy and immunology at Medical City Children’s Hospital, Dallas, noted that the findings pertain to the subset of peanut oral immunotherapy patients (about 30%) who want to be able to eat peanuts.

“Most families just want protection against accidental ingestion, and these observations don’t relate to those patients,” he said in an interview.

Dr. Wasserman noted that his approach with patients is to wait until 3 years of daily maintenance after buildup (as opposed to 2 years in the study) before considering an avoidance challenge.

“When our patients pass a sustained unresponsiveness challenge, we recommend continued exposure of 2,000 mg at least weekly,” he explained.

Dr. Wasserman added that the study’s findings on biomarker changes were notable.

“The eventual reduction in peanut serum IgE in all of their patients is very interesting,” he said. “Many of our patients’ peanut serum IgE plateaus after 2 or 3 years.”

And he added, “This report suggests that we should be making patients aware that they may further decrease their peanut serum IgE by increasing their maintenance dose.”

The study was funded by the Scurlock Foundation/Waring Family Foundation and the Texas Children’s Hospital food allergy program. Dr. Davis is a consultant for Aimmune, DBV, and Moonlight Therapeutics. Dr. Wasserman is a consultant for Aimmune and DBV.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

Children with peanut allergies treated with peanut oral immunotherapy for 3 years can tolerate increasingly higher exposures to peanuts. But avoidance of peanut-protein exposure for just a single month after the treatment leads to rapid and substantial decreases in tolerance, findings from a small study show.

The findings “underscore the fact that the desensitization achieved with peanut oral immunotherapy is a transient immune state,” report the authors of the study, published in December in The Journal of Allergy and Clinical Immunology: In Practice.

Therefore, “adherence to dosing [in peanut immunotherapy] is very important, and clinicians should expect a decline in tolerance with lapse in dosing,” first author Carla M. Davis, MD, director of the Texas Children’s Hospital Food Allergy Program at Baylor College of Medicine, Houston, told this news organization.

Oral immunotherapy, involving small exposures to peanut protein to build up desensitization, has been shown to mitigate allergic reactions, and, as reported by this news organization, the first peanut oral immunotherapy drug recently received approval from the U.S. Food and Drug Administration.

However, current approaches involve very low daily exposure of about 300 mg of peanut protein, equivalent to only about one to two peanuts, and research is lacking regarding the maximum tolerated doses, as well as on how long the tolerance is sustained if maintenance therapy is discontinued. “For the peanut-allergic population that would like to eat more than 1-2 peanuts, an achievable dose is currently unknown,” the study authors write. “The critical question, of the maximum tolerated dose achieved after POIT, has not been answered.”

To evaluate those issues in their phase 2 study, Dr. Davis and her colleagues enrolled 28 subjects between the ages of 5 and 13 with a diagnosis of eosinophilic esophagitis and peanut allergy.

The treatment protocol included a 1-year buildup phase of oral immunotherapy, followed by a 2-year daily maintenance phase with a dose of 3,900 mg of peanut protein.

After consenting, 11 patients dropped out of the study due to a lack of interest, and two more withdrew after failing to tolerate their first dose, leaving 15 who started treatment in the study, with a mean age of 8.7 years (range, 5.2-12.5 years), and 47% female.

Twelve patients reached the maintenance dose of 3,900 mg over a median of 13 months, and double-blind, placebo-controlled peanut challenges showed that, on average, their mean maximum cumulative tolerated dose after 12 months increased by 12,063 mg (P < .001), and the mean dose triggering a reaction increased by 15,667 mg.

Of the 12 patients, 11 (91.7%) were able to successfully tolerate at least 10,725 mg after 12 months of treatment, and six patients (50.0%) successfully tolerated at least 15,225 mg.

Two patients were able to tolerate up to the maximum cumulative target dose of 26,225 mg, equivalent to more than 105 peanuts.

“The ability to tolerate [greater than] 100 peanuts following peanut oral immunotherapy has never before been demonstrated and gives insight into the potential for food oral immunotherapy to be utilized in a subset of patients who have an immunologic phenotype accepting of this therapy,” the authors write.

“Understanding the risk of ingestion of peanut protein higher than the prescribed peanut oral immunotherapy maintenance dose will improve the safe, practical use of [the therapy],” they add.
 

 

 

Tolerance plummets with avoidance

In the protocol’s third phase, after the 3-year buildup and maintenance therapy, daily peanut exposure was avoided for 30 days, and among the six patients who participated, the mean maximum cumulative tolerated dose declined to just 2,783 mg, and the reaction dose dropped to 4,614 mg (P = .03).

“This was a disappointing finding, because we thought the desensitization would last longer after such a long period of treatment,” Dr. Davis said.

While the avoidance period was only a month, Dr. Davis said she expects the rebound in sensitivity would continue if avoidance was prolonged. “Other studies indicate the decline in tolerance would continue over time, [and] we believe it would continue to decline,” she said.

Further analysis of peanut allergy biomarkers showed significant decreases in skin prick test wheal size and cytokine expression within the first 6 weeks of initiation of the peanut oral immunotherapy. The patterns were reversed during the 1-month avoidance, with both measures increasing.

Of note, the changes in biomarkers varied significantly among the participants.

In terms of adverse events, eight patients (53%) required one or two doses of epinephrine during the study, with all but two patients receiving the epinephrine during the 12-month buildup phase, consistent with previous studies.

In commenting on the study, Richard L. Wasserman, MD, PhD, medical director of pediatric allergy and immunology at Medical City Children’s Hospital, Dallas, noted that the findings pertain to the subset of peanut oral immunotherapy patients (about 30%) who want to be able to eat peanuts.

“Most families just want protection against accidental ingestion, and these observations don’t relate to those patients,” he said in an interview.

Dr. Wasserman noted that his approach with patients is to wait until 3 years of daily maintenance after buildup (as opposed to 2 years in the study) before considering an avoidance challenge.

“When our patients pass a sustained unresponsiveness challenge, we recommend continued exposure of 2,000 mg at least weekly,” he explained.

Dr. Wasserman added that the study’s findings on biomarker changes were notable.

“The eventual reduction in peanut serum IgE in all of their patients is very interesting,” he said. “Many of our patients’ peanut serum IgE plateaus after 2 or 3 years.”

And he added, “This report suggests that we should be making patients aware that they may further decrease their peanut serum IgE by increasing their maintenance dose.”

The study was funded by the Scurlock Foundation/Waring Family Foundation and the Texas Children’s Hospital food allergy program. Dr. Davis is a consultant for Aimmune, DBV, and Moonlight Therapeutics. Dr. Wasserman is a consultant for Aimmune and DBV.

A version of this article first appeared on Medscape.com.

Children with peanut allergies treated with peanut oral immunotherapy for 3 years can tolerate increasingly higher exposures to peanuts. But avoidance of peanut-protein exposure for just a single month after the treatment leads to rapid and substantial decreases in tolerance, findings from a small study show.

The findings “underscore the fact that the desensitization achieved with peanut oral immunotherapy is a transient immune state,” report the authors of the study, published in December in The Journal of Allergy and Clinical Immunology: In Practice.

Therefore, “adherence to dosing [in peanut immunotherapy] is very important, and clinicians should expect a decline in tolerance with lapse in dosing,” first author Carla M. Davis, MD, director of the Texas Children’s Hospital Food Allergy Program at Baylor College of Medicine, Houston, told this news organization.

Oral immunotherapy, involving small exposures to peanut protein to build up desensitization, has been shown to mitigate allergic reactions, and, as reported by this news organization, the first peanut oral immunotherapy drug recently received approval from the U.S. Food and Drug Administration.

However, current approaches involve very low daily exposure of about 300 mg of peanut protein, equivalent to only about one to two peanuts, and research is lacking regarding the maximum tolerated doses, as well as on how long the tolerance is sustained if maintenance therapy is discontinued. “For the peanut-allergic population that would like to eat more than 1-2 peanuts, an achievable dose is currently unknown,” the study authors write. “The critical question, of the maximum tolerated dose achieved after POIT, has not been answered.”

To evaluate those issues in their phase 2 study, Dr. Davis and her colleagues enrolled 28 subjects between the ages of 5 and 13 with a diagnosis of eosinophilic esophagitis and peanut allergy.

The treatment protocol included a 1-year buildup phase of oral immunotherapy, followed by a 2-year daily maintenance phase with a dose of 3,900 mg of peanut protein.

After consenting, 11 patients dropped out of the study due to a lack of interest, and two more withdrew after failing to tolerate their first dose, leaving 15 who started treatment in the study, with a mean age of 8.7 years (range, 5.2-12.5 years), and 47% female.

Twelve patients reached the maintenance dose of 3,900 mg over a median of 13 months, and double-blind, placebo-controlled peanut challenges showed that their mean maximum cumulative tolerated dose increased by 12,063 mg after 12 months (P < .001), while the mean dose triggering a reaction increased by 15,667 mg.

Of the 12 patients, 11 (91.7%) were able to successfully tolerate at least 10,725 mg after 12 months of treatment, and six patients (50.0%) successfully tolerated at least 15,225 mg.

Two patients were able to tolerate up to the maximum cumulative target dose of 26,225 mg, equivalent to more than 105 peanuts.

“The ability to tolerate [greater than] 100 peanuts following peanut oral immunotherapy has never before been demonstrated and gives insight into the potential for food oral immunotherapy to be utilized in a subset of patients who have an immunologic phenotype accepting of this therapy,” the authors write.

“Understanding the risk of ingestion of peanut protein higher than the prescribed peanut oral immunotherapy maintenance dose will improve the safe, practical use of [the therapy],” they add.

Tolerance plummets with avoidance

In the protocol’s third phase, after the 3 years of buildup and maintenance therapy, daily peanut exposure was avoided for 30 days. Among the six patients who participated in this phase, the mean maximum cumulative tolerated dose declined to just 2,783 mg, and the reaction-triggering dose dropped to 4,614 mg (P = .03).

“This was a disappointing finding, because we thought the desensitization would last longer after such a long period of treatment,” Dr. Davis said.

While the avoidance period was only a month, Dr. Davis said she expects the rebound in sensitivity would continue if avoidance was prolonged. “Other studies indicate the decline in tolerance would continue over time, [and] we believe it would continue to decline,” she said.

Further analysis of peanut allergy biomarkers showed significant decreases in skin prick test wheal size and cytokine expression within the first 6 weeks of initiation of the peanut oral immunotherapy. The patterns were reversed during the 1-month avoidance, with both measures increasing.

Of note, the changes in biomarkers varied significantly among the participants.

In terms of adverse events, eight patients (53%) required one or two doses of epinephrine during the study, with all but two patients receiving the epinephrine during the 12-month buildup phase, consistent with previous studies.

In commenting on the study, Richard L. Wasserman, MD, PhD, medical director of pediatric allergy and immunology at Medical City Children’s Hospital, Dallas, noted that the findings pertain to the subset of peanut oral immunotherapy patients (about 30%) who want to be able to eat peanuts.

“Most families just want protection against accidental ingestion, and these observations don’t relate to those patients,” he said in an interview.

Dr. Wasserman noted that his approach with patients is to wait until 3 years of daily maintenance after buildup (as opposed to 2 years in the study) before considering an avoidance challenge.

“When our patients pass a sustained unresponsiveness challenge, we recommend continued exposure of 2,000 mg at least weekly,” he explained.

Dr. Wasserman added that the study’s findings on biomarker changes were notable.

“The eventual reduction in peanut serum IgE in all of their patients is very interesting,” he said. “Many of our patients’ peanut serum IgE plateaus after 2 or 3 years.”

And he added, “This report suggests that we should be making patients aware that they may further decrease their peanut serum IgE by increasing their maintenance dose.”

The study was funded by the Scurlock Foundation/Waring Family Foundation and the Texas Children’s Hospital food allergy program. Dr. Davis is a consultant for Aimmune, DBV, and Moonlight Therapeutics. Dr. Wasserman is a consultant for Aimmune and DBV.

A version of this article first appeared on Medscape.com.


COVID-19 vaccinations in people with HIV reflect general rates despite higher mortality risk, study says

Article Type
Changed
Wed, 12/29/2021 - 09:41

 

Around the world, people with HIV show variations in COVID-19 vaccination rates similar to those seen in the general population, raising concerns because of their increased risk for morbidity and mortality from COVID-19 infection.

“To our knowledge, this analysis presents the first and largest investigation of vaccination rates among people with HIV,” reported the authors in research published in the Journal of Infectious Diseases.

The findings reflect data on nearly 7,000 people with HIV participating in the REPRIEVE clinical trial. As of July, COVID-19 vaccination rates ranged from a high of 71% in higher-income regions to just 18% in sub-Saharan Africa, bottoming out at 0% in Haiti.

“This disparity in COVID-19 vaccination rates among people with HIV across income regions may increase morbidity from COVID-19 in the most vulnerable HIV populations,” the authors noted.

In general, recent research has shown that people with HIV have as much as 29% higher odds of mortality from COVID-19 than the general population, and 20% higher odds of hospitalization; hence, their need for vaccination is especially pressing.

To understand the vaccination rates, the authors looked at data from the ongoing REPRIEVE trial, designed to investigate primary cardiovascular prevention worldwide among people with HIV. The trial includes data on COVID-19 vaccination status, providing a unique opportunity to capture those rates.

The study specifically included 6,952 people with HIV aged 40-75 years and on stable antiretroviral therapy (ART), without known cardiovascular disease, and a low to moderate atherosclerotic cardiovascular disease (ASCVD) risk.

The diverse participants with HIV were from 12 countries, including 66% who were people of color, as well as 32% women. Countries represented include Brazil (n = 1,042), Botswana (n = 273), Canada (n = 123), Haiti (n = 136), India (n = 469), Peru (n = 142), South Africa (n = 527), Spain (n = 198), Thailand (n = 582), Uganda (n = 175), United States (n = 3,162), and Zimbabwe (n = 123).

With vaccination defined as having received at least one vaccine shot, the overall cumulative COVID-19 vaccination rate in the study was 55% through July 2021.

By region, the highest cumulative rates were in the high-income countries of the United States and Canada (71%), followed by Latin America and the Caribbean (59%) – all consistent with the general population in these areas.

Lower cumulative vaccination rates were observed in South Asia (49%), Southeast/East Asia (41%), and sub-Saharan Africa (18%), also reflecting the regional vaccination rates.

The United States had the highest country-specific COVID-19 vaccination rate of 72%, followed by Peru (69%) and Brazil (63%). Countries with the lowest vaccination rates were South Africa (18%), Uganda (3%), and Haiti (0%).

Of note, South Africa and Botswana have the largest share of deaths from HIV/AIDS, and both had very low COVID-19 vaccination rates in general, compared with high-income countries.

Overall, factors linked to the likelihood of being vaccinated included residing in the high-income U.S./Canada Global Burden of Disease superregion, as well as being White, male, older, having a higher body mass index (BMI), a higher ASCVD risk score, and longer duration of ART.

Participants’ decisions regarding COVID-19 vaccination in the study were made individually and were not based on any study-related recommendations or requirements, the authors noted.

Vaccination rates were higher among men than women in most regions, with the exception of sub-Saharan Africa. Vaccination rates were higher among Whites than Blacks in the U.S./Canada high-income region, with a high proportion of participants from the United States.

“It was surprising to us – and unfortunate – that in the high-income superregion vaccination rates were higher among individuals who identified as White than those who identified as Black and among men,” senior author Steven K. Grinspoon, MD, said in an interview.

“Given data for higher morbidity from COVID-19 among people of color with HIV, this disparity is likely to have significant public health implications,” said Dr. Grinspoon, a professor of medicine at Harvard Medical School and chief of the metabolism unit at Massachusetts General Hospital, both in Boston.

Newer data from the REPRIEVE study through October have shown continued steady increases in cumulative vaccination rates in all regions, Dr. Grinspoon noted, with the largest increases in Southeast/East Asia, South Asia, and sub-Saharan Africa, whereas rates leveled off in the high-income regions.

Overall, “it is encouraging that rates among people with HIV are similar to and, in many regions, higher than the general population,” Dr. Grinspoon said.

However, with the data showing a higher risk for COVID-19 death in people with HIV, “it is critical that people with HIV, representing a vulnerable and immunocompromised population, be vaccinated for COVID-19,” Dr. Grinspoon said.

Commenting on the study, Monica Gandhi, MD, MPH, director of the Gladstone Center for AIDS Research at the University of California, San Francisco, agreed that “it is encouraging that these rates are as high as the general population, showing that there is not excess hesitancy among those living with HIV.”

Unlike other immunocompromised groups, people with HIV were not necessarily prioritized for vaccination, since antiretroviral therapy can reconstitute the immune system, “so I am not surprised the [vaccination] rates aren’t higher,” Dr. Gandhi, who was not involved with the study, said in an interview.

Nevertheless, “it is important that those with risk factors for more severe disease, such as higher BMI and higher cardiovascular disease, are prioritized for COVID-19 vaccination, [as] these are important groups in which to increase rates,” she said.

“The take-home message is that we have to increase our rates of vaccination in this critically important population,” Dr. Gandhi emphasized. “Global vaccine equity is paramount given that the burden of HIV infections remains in sub-Saharan Africa.”

The study received support from the National Institutes of Health and funding from Kowa Pharmaceuticals and Gilead Sciences. The authors and Dr. Gandhi disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Article Source

FROM THE JOURNAL OF INFECTIOUS DISEASES


Postmenopausal women with early breast cancer can go chemo-free

Article Type
Changed
Thu, 12/15/2022 - 17:25

New results from the phase 3 RxPONDER trial add to mounting evidence that most postmenopausal women with early-stage breast cancer derive no added benefits from chemotherapy and can be effectively treated with endocrine therapy alone.

The study, published in The New England Journal of Medicine, conversely shows that premenopausal women do benefit from adjuvant chemotherapy, theorized by many to largely be the result of chemotherapy-induced ovarian function suppression.

The RxPONDER trial results are in line with those from the practice-changing TAILORx trial and underscore that “postmenopausal women with 1 to 3 positive nodes and [a recurrence score] of 0 to 25 can likely safely forgo adjuvant chemotherapy without compromising invasive disease-free survival,” first author Kevin Kalinsky, MD, of the Winship Cancer Institute at Emory University, Atlanta, told this news organization. “This will save tens of thousands of women the time, expense, and potentially harmful side effects that can be associated with chemotherapy infusions.”

However, the authors note, “premenopausal women with 1-3 positive lymph nodes had a significant benefit from chemotherapy.”

The study, conducted by the Southwest Oncology Group (SWOG) Cancer Research Network, involved 5,018 women with hormone receptor (HR)-positive, human epidermal growth factor receptor 2 (HER2)-negative breast cancer with one to three positive axillary lymph nodes – a breast cancer profile that represents approximately 20% of cases in the U.S.

All women had recurrence scores on the 100-point 21-gene breast cancer assay (Oncotype Dx) under 25, which is considered the lowest risk of recurrence. Patients were randomized to treatment with endocrine therapy only (n = 2,507) or chemotherapy followed by endocrine therapy (n = 2,511).

After a median follow-up of 5.3 years, women treated with adjunctive chemotherapy plus endocrine therapy exhibited no significant improvements in invasive disease-free survival compared to those who received endocrine therapy alone.

A prespecified analysis stratifying women by menopausal status underscored those results among postmenopausal women. In this cohort, researchers reported invasive disease-free survival was 91.9% in the endocrine-only group and 91.3% in the chemotherapy group (HR, 1.02; P = .89), indicating no benefit of the adjunctive chemotherapy.

However, among premenopausal women, the invasive disease-free survival rate was significantly higher with the addition of chemotherapy – 89.0% with endocrine-only therapy and 93.9% with both therapies (HR, 0.60; P = .002). Increases in distant relapse-free survival observed in the dual-therapy group similarly favored adding chemotherapy (HR, 0.58; P = .009).

Even when the authors further stratified the women into recurrence score groups of 0 to 13 or 14 to 25, the results remained consistent. Postmenopausal women in both recurrence score groups continued to show no benefit from chemotherapy with respect to invasive disease recurrence, new primary cancer, or death (HR, 1.01 in each score group). Conversely, premenopausal women showed significant improvements in those outcomes when chemotherapy was added to endocrine therapy.

To what degree were the effects observed in premenopausal women the result of chemotherapy-induced ovarian suppression?

“I think it’s fair to say it’s the most interesting question right now in early-stage breast cancer for ER-positive tumors,” Harold Burstein, MD, of the Dana-Farber Cancer Institute and Harvard Medical School, Boston, said during a debate at the recent San Antonio Breast Cancer Symposium.

According to Sibylle Loibl, MD, PhD, when it comes to the use of chemotherapy, “age matters.”

“I strongly believe the biology of tumors is different in younger women with HR-positive/HER2-negative breast cancer,” Dr. Loibl, an associate professor at the University of Frankfurt, said during the debate. “It’s a different disease and the effects of chemotherapy are different.”

In young women, chemotherapy has “a direct cytotoxic effect, which cannot be neglected, and an endocrine effect on ovarian function suppression,” Dr. Loibl added. “I think both are needed in young premenopausal patients.”

According to the RxPONDER authors, “whether a chemotherapy benefit in premenopausal women is due to both direct cytocidal effects and treatment-induced menopause remains unclear,” but they noted that “it is possible that the contribution of these mechanisms may vary according to age.”

Further complicating matters, Dr. Loibl added, is that age appears to be poorly represented in genetic diagnostic tools.

“I think the gene expression profiles we are currently using as standard diagnostic tools do not capture the right biology for our premenopausal patients,” she said. “We have to keep in mind that these tests were designed and validated in postmenopausal patients and were only retrospectively used in premenopausal patients.”

The study was funded by the National Cancer Institute and others. Dr. Loibl has received honoraria from Prime and Chugai and numerous institutional research grants.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

New results from the phase 3 RxPONDER trial add to mounting evidence that most postmenopausal women with early-stage breast cancer derive no added benefits from chemotherapy and can be effectively treated with endocrine therapy alone.

The study, published in The New England Journal of Medicine, conversely shows that premenopausal women do benefit from adjuvant chemotherapy, theorized by many to largely be the result of chemotherapy-induced ovarian function suppression.

The RxPONDER trial results are in line with those from the practice-changing TAILORx trial and underscore that “postmenopausal women with 1 to 3 positive nodes and [a recurrence score] of 0 to 25 can likely safely forgo adjuvant chemotherapy without compromising invasive disease-free survival,” first author Kevin Kalinsky, MD, of the Winship Cancer Institute at Emory University, Atlanta, told this news organization. “This will save tens of thousands of women the time, expense, and potentially harmful side effects that can be associated with chemotherapy infusions.”

However, the authors note, “premenopausal women with 1-3 positive lymph nodes had a significant benefit from chemotherapy.”

The study, conducted by the Southwest Oncology Group (SWOG) Cancer Research Network, involved 5,018 women with hormone receptor (HR)-positive, human epidermal growth factor receptor 2 (HER2)-negative breast cancer with one to three positive axillary lymph nodes – a breast cancer profile that represents approximately 20% of cases in the U.S.

All women had recurrence scores on the 100-point 21-gene breast cancer assay (Oncotype Dx) under 25, which is considered the lowest risk of recurrence. Patients were randomized to treatment with endocrine therapy only (n = 2,507) or chemotherapy followed by endocrine therapy (n = 2,511).

After a median follow-up of 5.3 years, women treated with adjunctive chemotherapy plus endocrine therapy exhibited no significant improvements in invasive disease-free survival compared to those who received endocrine therapy alone.

A prespecified analysis stratifying women by menopausal status underscored those results among postmenopausal women. In this cohort, researchers reported invasive disease-free survival of 91.9% in the endocrine-only group and 91.3% in the chemotherapy group (HR, 1.02; P = .89), indicating no benefit from the addition of chemotherapy.

However, among premenopausal women, the invasive disease-free survival rate was significantly higher with the addition of chemotherapy – 89.0% with endocrine-only therapy and 93.9% with both therapies (HR, 0.60; P = .002). Increases in distant relapse-free survival observed in the dual-therapy group similarly favored adding chemotherapy (HR, 0.58; P = .009).

Even when the authors further stratified the women by recurrence score (0 to 13 or 14 to 25), the results remained consistent. Postmenopausal women in both recurrence score groups derived no benefit from chemotherapy with respect to invasive disease recurrence, new primary cancer, or death (HR, 1.01 in each group). Conversely, premenopausal women showed significant improvements in those outcomes when chemotherapy was added to endocrine therapy.

To what degree were the effects observed in premenopausal women the result of chemotherapy-induced ovarian suppression?

“I think it’s fair to say it’s the most interesting question right now in early-stage breast cancer for ER-positive tumors,” Harold Burstein, MD, of the Dana-Farber Cancer Institute and Harvard Medical School, Boston, said during a debate at the recent San Antonio Breast Cancer Symposium.

According to Sibylle Loibl, MD, PhD, when it comes to the use of chemotherapy, “age matters.”

“I strongly believe the biology of tumors is different in younger women with HR-positive/HER2-negative breast cancer,” Dr. Loibl, an associate professor at the University of Frankfurt, said during the debate. “It’s a different disease and the effects of chemotherapy are different.”

In young women, chemotherapy has “a direct cytotoxic effect, which cannot be neglected, and an endocrine effect on ovarian function suppression,” Dr. Loibl added. “I think both are needed in young premenopausal patients.”

According to the RxPONDER authors, “whether a chemotherapy benefit in premenopausal women is due to both direct cytocidal effects and treatment-induced menopause remains unclear,” but they noted that “it is possible that the contribution of these mechanisms may vary according to age.”

Further complicating matters, Dr. Loibl added, is that age appears to be poorly represented in genetic diagnostic tools.

“I think the gene expression profiles we are currently using as standard diagnostic tools do not capture the right biology for our premenopausal patients,” she said. “We have to keep in mind that these tests were designed and validated in postmenopausal patients and were only retrospectively used in premenopausal patients.”

The study was funded by the National Cancer Institute and others. Dr. Loibl has received honoraria from Prime and Chugai, as well as numerous institutional research grants.

A version of this article first appeared on Medscape.com.

