Expert says progress in gut-brain research requires an open mind

A growing body of research links the gut with the brain and behavior, but compartmentalization within the medical community may be slowing investigation of the gut-brain axis, according to a leading expert.

Studies have shown that the microbiome may influence a diverse range of behavioral and neurological processes, from acute and chronic stress responses to development of Parkinson’s and Alzheimer’s disease, reported John F. Cryan, PhD, of University College Cork, Ireland.

Dr. Cryan began his presentation at the annual Gut Microbiota for Health World Summit by citing Hippocrates, who is thought to have stated that all diseases begin in the gut.

“That can be quite strange when I talk to my neurology or psychiatry colleagues,” Dr. Cryan said. “They sometimes look at me like I have two heads. Because in medicine we compartmentalize, and if you are studying neurology or psychiatry or [you are] in clinical practice, you are focusing on everything from the neck upwards.”

For more than a decade, Dr. Cryan and colleagues have been investigating the gut-brain axis, predominantly in mouse models, but also across animal species and in humans.

At the meeting, sponsored by the American Gastroenterological Association and the European Society for Neurogastroenterology and Motility, Dr. Cryan reviewed a variety of representative studies.

For instance, in both mice and humans, cesarean delivery, which is associated with lower microbiome diversity than vaginal delivery, has been linked with social deficits and elevated stress responses. And in mice, coprophagia, in which cesarean-delivered animals eat the feces of vaginally born mice, has been shown to ameliorate these effects.

Dr. Cryan likened this process to an “artificial fecal transplant.”

“You know, co-housing and eating each other’s poo is not the translational approach that we were advocating by any means,” Dr. Cryan said. “But at least it tells us – in a proof-of-concept way – that if we change the microbiome, then we can reverse what’s going on.”

While the mechanisms behind the gut-brain axis remain incompletely understood, Dr. Cryan noted that the vagus nerve, which travels from the gut to the brain, plays a central role, and that transecting this nerve in mice stops the microbiome from affecting the brain.

“What happens in vagus doesn’t just stay in vagus, but will actually affect our emotions in different ways,” Dr. Cryan said.

He emphasized that communication travels both ways along the gut-brain axis, and went on to describe how this phenomenon has been demonstrated across a wide array of animals.

“From insects all the way through to primates, if you start to interfere with social behavior, you change the microbiome,” Dr. Cryan said. “But the opposite is also true; if you start to change the microbiome you can start to have widespread effects on social behavior.”

In humans, manipulating the microbiome could open up new psychiatric frontiers, Dr. Cryan said.

“[In the past 30 years], there really have been no real advances in how we manage mental health,” he said. “That’s very sobering when we are having such a mental health problem across all ages right now. And so perhaps it’s time for what we’ve coined the ‘psychobiotic revolution’ – time for a new way of thinking about mental health.”

According to Dr. Cryan, psychobiotics are interventions that target the microbiome for mental health purposes, including fermented foods, probiotics, prebiotics, synbiotics, parabiotics, and postbiotics.

Among these, probiotics have been a focal point of interventional research. Although results have been mixed, Dr. Cryan suggested that negative probiotic studies more likely reflect the choice of bacterial strain than a failure of the concept as a whole.

“Most strains of bacteria will do absolutely nothing,” Dr. Cryan said. “Strain is really important.”

In demonstration of this concept, he recounted a 2017 study conducted at University College Cork in which 22 healthy volunteers were given Bifidobacterium longum 1714, and then subjected to a social stress test. The results, published in Translational Psychiatry, showed that the probiotic, compared with placebo, was associated with attenuated stress responses, reduced daily stress, and enhanced visuospatial memory.

In contrast, a similar study by Dr. Cryan and colleagues, which tested Lactobacillus rhamnosus (JB-1), fell short.

“You [could not have gotten] more negative data into one paper if you tried,” Dr. Cryan said, referring to the study. “It did absolutely nothing.”

To find out which psychobiotics may have an impact, and how, Dr. Cryan called for more research.

“It’s still early days,” he said. “We probably have more meta-analyses and systematic reviews of the field than we have primary research papers.”

Dr. Cryan concluded his presentation on an optimistic note.

“Neurology is waking up ... to understand that the microbiome could be playing a key role in many, many other disorders. ... Overall, what we’re beginning to see is that our state of gut markedly affects our state of mind.”

Dr. Cryan disclosed relationships with Abbott Nutrition, Roche Pharma, Nutricia, and others.

FROM GMFH 2020

Microbiome profiling ready to take personalized medicine to next level

Standards and technology are now available for microbiome profiling to take personalized medicine to the next level, but prospective trials are needed to realize this possibility, according to a leading expert.

The need for prospective microbiomics trials is increasing with the incidence of immune disorders, many of which have been linked with disturbances in the gut, reported Joël Doré, PhD, at the annual Gut Microbiota for Health World Summit.

“In spite of considerable progress in medicine, together with hygiene, antibiotics, and vaccination developments, we are still seeing an increasing incidence – uncontrolled, that started over 60 years ago – of immune-mediated conditions,” said Dr. Doré, research director at the French National Research Institute for Agriculture, Food, and the Environment.

According to the World Health Organization, one out of four people will be affected by such a disorder in their lifetime, and the incidence of some conditions is rising faster than others, with geographic distributions that suggest environmental risk factors.

“The rate of incidence of autism in the U.S.A. is a quite scary exponential curve, where less than 1 birth per 5,000 was [affected] in the 1970s, where today it is 1 birth out of 50,” Dr. Doré said at the meeting sponsored by the American Gastroenterological Association and the European Society for Neurogastroenterology and Motility. “Prevention is an urgent need, and possibly if we do not manage to understand what’s going on, human longevity might be at stake.”

Multiple studies have shown that transferring microbiota from humans with immune disorders to healthy mice can induce clinical signs of immune disorders, he said. And between humans, fecal transplants from healthy donors have reduced symptoms in patients with conditions such as inflammatory bowel disease.

While these studies support the link between microbe-host relationships and immune function, most of the underlying mechanisms remain unknown, Dr. Doré said. He highlighted that the complex network of interactions involved sets microbiomics research apart from conventional approaches to disease.

“I want to stress the fact that clinical trials [in the past] have been designed in the situation where infection was the problem, [but] infection is essentially a linear thing; one agent, one risk, one disease,” Dr. Doré said. “What we are dealing with – with the increasing incidence of immune disorders – is host-microbe interaction at the center of everything, and alteration of host-microbe leading to risk, which may lead to disease. But host-microbe interaction is under the control of a vast number of environmental aspects ... so tools to deal with innovation and translation in a totally different or systemic configuration have yet to be invented.”

According to Dr. Doré, to develop clinical applications, research procedures must first be standardized.

“To be of use for the clinician and general practitioner, microbiome profiling will have to rely on pipelines of standardized preanalytical and analytical procedures,” he said. “This starts from sample collection and shipment.”

For the past 5 years, Dr. Doré and colleagues have been working to standardize procedures with a number of organizations around the world, and progress has been made.

“Today we have very good standards for shotgun sequencing,” he said.

With standards solidifying, microbiomics may lead to new clinical strategies for a range of conditions, even beyond immune disorders, Dr. Doré said. He noted that, as a relatively simple measure, gene richness in the microbiome may be used as a health stratifier. Studies have shown that low gene count has been associated with more severe metabolic and inflammatory traits among overweight patients, a lack of response to low-calorie diets among overweight and obese patients, severity of related conditions and risk of mortality among patients with liver cirrhosis, and poorer responses to immunotherapy among patients with cancer.
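
As a rough illustration of the gene richness idea (not the specific pipeline Dr. Doré described), the short sketch below counts how many genes are detected per sample in a hypothetical shotgun-derived abundance table and splits samples at the cohort median into low and high gene count groups; the table, names, and threshold are assumptions made for the example.

```python
import pandas as pd

# Hypothetical gene-abundance matrix: rows are genes, columns are samples.
abundance = pd.DataFrame(
    {"patient_A": [0.0, 1.2, 0.0, 3.4],
     "patient_B": [0.5, 0.9, 2.1, 0.7],
     "patient_C": [0.0, 0.0, 0.0, 1.1]},
    index=["gene_1", "gene_2", "gene_3", "gene_4"],
)

# Gene richness = number of genes detected (nonzero abundance) in each sample.
richness = (abundance > 0).sum(axis=0)

# Stratify at the cohort median into low vs. high gene count groups.
group = richness.ge(richness.median()).map({True: "high gene count", False: "low gene count"})
print(pd.concat([richness.rename("gene_richness"), group.rename("group")], axis=1))
```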

Certain patterns of flora may be prognostic, Dr. Doré said, citing a study by Gopalakrishnan et al. that involved 112 melanoma patients, in which those with a high abundance of Faecalibacterium had significantly longer progression-free survival than patients with a low abundance of the same bacteria. Further, a multivariate model showed that a high abundance of Faecalibacterium was the strongest predictor of response to immunotherapy (hazard ratio [HR], 2.95; P = .03), followed closely by prior immunotherapy (HR, 2.87; P = .03). In contrast, patients with a high abundance of Bacteroidales had shorter progression-free survival than patients with a low abundance of the same bacteria.
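
For readers curious how such a multivariable survival model is typically fit, here is a minimal sketch using the lifelines library on a small, made-up patient table; the column names and values are illustrative assumptions, not the data or exact model of the Gopalakrishnan study.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy per-patient table (illustrative only): follow-up in months, progression
# indicator, and binary covariates for microbiome group and prior treatment.
df = pd.DataFrame({
    "pfs_months": [3.1, 11.4, 6.2, 18.0, 2.5, 14.7, 4.8, 20.3],
    "progressed": [1, 0, 1, 1, 1, 0, 0, 0],
    "high_faecalibacterium": [0, 1, 0, 1, 0, 1, 0, 1],
    "prior_immunotherapy": [0, 1, 1, 0, 0, 1, 1, 0],
})

# Cox proportional hazards model: exp(coef) is the hazard ratio for each covariate.
cph = CoxPHFitter()
cph.fit(df, duration_col="pfs_months", event_col="progressed")
cph.print_summary()
```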

Dr. Doré also referred to one of the first interventional microbiomics studies in oncology. Mohty et al. conducted the ODYSSEE phase 1b/2a trial involving 25 patients with acute myeloid leukemia, in which patients were given autologous fecal microbiota transplants after induction chemotherapy and antibiotics. The treatment recovered 90% of original microbiota, and the estimated 1-year overall survival rate was 84%, compared with a historical rate of 70%.

The ODYSSEE study serves as proof of concept that microbiomics may eventually offer the next level of personalized medicine, Dr. Doré said. And now, with standards and technology available, researchers can move forward.

Dr. Doré disclosed relationships with BioFortis, Janssen, Sanofi, and other pharmaceutical companies.

EXPERT ANALYSIS FROM GMFH 2020

Molecularly guided therapy in pancreatic cancer: Untapped potential and the ‘bright future’ ahead

Molecularly guided treatments may extend survival by more than a year for patients with pancreatic cancer who have actionable molecular alterations, according to a retrospective analysis of almost 2,000 patients in the Know Your Tumor registry.

While patients with actionable alterations remain in the minority, experts suggest the study’s results provide a ray of hope for treating a cancer that has historically been associated with a poor prognosis and disappointing clinical trials.

Patients with actionable molecular alterations who received matched therapies had a median overall survival of 2.58 years, compared with 1.51 years for those who received unmatched therapies, reported lead author Michael J. Pishvaian, MD, PhD, of MD Anderson Cancer Center in Houston, and colleagues.

“Our study provides strong rationale that tumor-based molecular profiling for patients with pancreatic cancer should be routinely performed and encourages prospective clinical trials based on this or similar platforms,” the investigators wrote in Lancet Oncology.

In an accompanying comment, Jörg Kleeff, MD, and Christoph W. Michalski, MD, of Martin-Luther University Halle-Wittenberg in Germany, supported this conclusion, calling the study “an encouraging starting point for a structured investigation of molecularly matched therapies.”

Dr. Kleeff and Dr. Michalski also highlighted the untapped potential revealed by the study, noting that only 4% of patients received a molecularly matched therapy, even though one-quarter had actionable alterations.

“These findings are important in that they define an estimation of the current number of potentially actionable targets and in that they provide a – rather disappointing – real-world assessment of the number of patients who actually received molecularly targeted treatment,” Dr. Kleeff and Dr. Michalski wrote.

They went on to describe a list of unanswered questions in the field, ranging from ethical dilemmas that may be encountered when choosing between targeted trials and chemotherapy for patients with targetable alterations, to more tangible subjects, such as genome sequencing techniques and therapeutic timing.

Their comment and the related study were published simultaneously with a series of pancreatic cancer articles in Lancet journals, including a review of pancreatic cancer therapies.

According to the authors of the therapeutic review, treatments for pancreatic cancer have “a bright future.”

“There is more optimism now than ever before that advances will be made by combining chemotherapy more effectively with agents that target the unique features of pancreatic ductal adenocarcinoma tumors,” the authors wrote. “The next 5-10 years should deliver major improvements in outcomes through the use of novel agents that specifically target pathological signaling pathways and genetic alterations.”

In an interview, Dana B. Cardin, MD, of the Vanderbilt-Ingram Cancer Center in Nashville, Tenn., shared this favorable outlook, which she said is particularly needed for a condition that has generally been left behind by the new era of personalized oncology treatments.

“There’s been a lot of frustration on the part of patients and doctors and everyone in the research community that there have been a lot of other tumor types [in which] learning about genetic changes in cancer cells has really revolutionized how patients are being treated,” Dr. Cardin said. “That is something that has really been elusive in pancreas cancer.”

The retrospective study by Dr. Pishvaian and colleagues serves as proof-of-concept by showing that large-scale genomic testing can also identify personalized treatments for patients with pancreatic cancer, Dr. Cardin said.

“When you do find them, even when it’s a small percentage of patients that may have actionable mutations, it really can make a huge difference in the outcomes for those patients,” she said. “We have to get rid of this sense of futility. If you’re not trying to look for those things, then you’re not ever going to find them.”

Regardless of whether a personalized treatment is available for a particular patient, Dr. Cardin emphasized the importance of a positive and active clinical mindset, as data suggest that existing supportive strategies can have a significant impact on patient health.

“We can make a difference for these patients,” Dr. Cardin said, “but we’re only going to make a difference if we try.”

Dr. Cardin, a National Comprehensive Cancer Network panelist for pancreatic cancer, went on to explain that outcomes in the control arms of pancreatic cancer clinical trials have been improving over the past decade, even though the standard control drug, gemcitabine, has stayed the same.

“It doesn’t mean that gemcitabine is better than it used to be,” Dr. Cardin said. “It probably means that we’re treating more patients, and we’re also doing a better job of supporting those patients.” She identified growth factors, nutritional support, and enzyme supplements as key ancillary treatments for those who need them.

Dr. Pishvaian and colleagues’ study was funded by Pancreatic Cancer Action Network and Perthera. The investigators disclosed relationships with Perthera and other companies. Dr. Kleeff, Dr. Michalski, and Dr. Cardin declared no conflicts of interest.

SOURCES: Pishvaian MJ et al. Lancet Oncol. 2020 Mar 2. doi: 10.1016/S1470-2045(20)30074-7; Kleeff J et al. Lancet Oncol. 2020 Mar 2. doi: 10.1016/S1470-2045(20)30148-0; Christenson ES et al. Lancet Oncol. 2020 Mar 2. doi: 10.1016/S1470-2045(19)30795-8.

FROM LANCET ONCOLOGY

Best definition of malnutrition varies by cancer type

For patients undergoing major oncologic surgery, the best definition of malnutrition used to assess postoperative risk varies by cancer type, results of a retrospective study suggest.

The current, one-size-fits-all approach to nutritional status leads to both undertreatment and overtreatment of malnutrition, as well as inaccurate estimations of postoperative risk, reported lead study author Nicholas P. McKenna, MD, of the Mayo Clinic in Rochester, Minn., and colleagues.

“Assessing nutritional status is important because it impacts preoperative planning, particularly with respect to the use of prehabilitation,” the investigators wrote. Their report is in the Journal of the American College of Surgeons. They noted that while prehabilitation has been shown to reduce postoperative risk among those who need it, identification of these patients is an area that needs improvement.

With this in mind, Dr. McKenna and colleagues analyzed 205,840 major oncologic operations, with data drawn from the American College of Surgeons National Surgical Quality Improvement Program (NSQIP) database.

The researchers evaluated patients’ nutritional status using three techniques: the NSQIP method, the European Society for Clinical Nutrition and Metabolism (ESPEN) definitions, and the World Health Organization body mass index (BMI) classification system.

Combining these three assessments led to seven hierarchical nutritional status categories:

  • Severe malnutrition – BMI less than 18.5 kg/m² and greater than 10% weight loss
  • ESPEN 1 – BMI 18.5-20 kg/m² (if younger than 70 years) or less than 22 kg/m² (if 70 years or older) plus greater than 10% weight loss
  • ESPEN 2 – BMI less than 18.5 kg/m²
  • NSQIP – BMI greater than 20 kg/m² (if younger than 70 years) or 22 kg/m² (if 70 years or older) plus greater than 10% weight loss
  • Mild malnutrition – BMI 18.5-20 kg/m² (if younger than 70 years) or less than 22 kg/m² (if 70 years or older)
  • Obese – BMI at least 30 kg/m²
  • No malnutrition.
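
As a sketch of how this hierarchy can be applied in practice (rules are checked top-down, so the first matching category wins), the function below encodes the thresholds listed above; the function name, argument names, and the assumption that weight loss is expressed as a percentage are illustrative, not part of the published methodology.

```python
def classify_nutrition(bmi: float, pct_weight_loss: float, age: int) -> str:
    """Assign one of the seven hierarchical categories described above.

    Rules are checked top-down, so the first category that matches wins.
    Thresholds follow the list in the article; everything else is illustrative.
    """
    major_loss = pct_weight_loss > 10
    low_bmi_cutoff = 20 if age < 70 else 22  # age-adjusted low-BMI threshold

    if bmi < 18.5 and major_loss:
        return "Severe malnutrition"
    if bmi < low_bmi_cutoff and major_loss:
        return "ESPEN 1"
    if bmi < 18.5:
        return "ESPEN 2"
    if major_loss:                            # BMI above the low-BMI cutoff
        return "NSQIP"
    if bmi < low_bmi_cutoff:
        return "Mild malnutrition"
    if bmi >= 30:
        return "Obese"
    return "No malnutrition"


# Example: a 65-year-old with BMI 19 kg/m² and 12% weight loss falls into ESPEN 1.
print(classify_nutrition(bmi=19.0, pct_weight_loss=12.0, age=65))
```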

The study’s primary outcomes were 30-day mortality and 30-day morbidity. The latter included a variety of complications, such as deep incisional surgical site infection, septic shock, and acute renal failure. Demographic and clinical factors were included in multivariable analyses.

Results

Most of the operations involved patients with colorectal cancer (74%), followed by pancreatic (10%), lung (9%), gastric (3%), esophageal (3%), and liver (2%) cancer.

Across all patients, 16% fell into one of five malnutrition categories: mild malnutrition (6%), NSQIP (6%), ESPEN 2 (2%), ESPEN 1 (1%), or severe malnutrition (0.6%). The remainder of patients were either obese (31%) or had normal nutritional status (54%).

Malnutrition was most common among patients with pancreatic cancer (28%) and least common among those with colorectal cancer (14%).

Aligning with previous research, this study showed that nutritional status was associated with postoperative risk. Mortality risk was highest among patients with severe malnutrition, and morbidity was most common in the severe and ESPEN 1 groups (P less than .0001 for both).

While the spectrum of classifications appeared accurate across the population, multivariable models for mortality and morbidity revealed an interaction between cancer type and malnutrition definition (P less than .0001 for both), which suggested the most accurate definition of malnutrition differed from one type of cancer to another.
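
To give a sense of what such an interaction test looks like, the sketch below fits a logistic regression with a cancer-by-malnutrition interaction term on simulated data using statsmodels; the variable names, simulated outcome, and model form are assumptions for illustration and do not reproduce the study’s actual models.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "cancer": rng.choice(["colorectal", "gastric", "pancreatic"], n),
    "nutrition": rng.choice(["none", "mild", "severe"], n),
})

# Simulate 30-day mortality so the effect of malnutrition differs by cancer type.
p = 0.10 + 0.15 * (df["nutrition"] == "severe") + 0.15 * (
    (df["cancer"] == "gastric") & (df["nutrition"] == "mild")
)
df["died_30d"] = rng.binomial(1, p)

# 'C(cancer) * C(nutrition)' fits main effects plus the cancer-by-nutrition interaction.
model = smf.logit("died_30d ~ C(cancer) * C(nutrition)", data=df).fit()
print(model.summary())
```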

Specifically, a classification of severe malnutrition was most predictive of mortality among patients with esophageal or colorectal cancer. ESPEN 1 was most predictive of mortality for patients with gastric or lung cancer, and NSQIP was most predictive for those with liver cancer.

For predicting morbidity, severe malnutrition was most accurate among patients with colorectal cancer, whereas ESPEN 1 was better suited for gastric and lung cancer.

Interpreting and applying the results

“The biggest takeaway is that the optimal definition of malnutrition varies by cancer type,” Dr. McKenna said in an interview.

He went on to explain that weight loss is a particularly important indicator of malnutrition for patients with esophageal or gastric cancer. “These are the cancers that more commonly undergo neoadjuvant chemotherapy,” he noted.

The other major finding, Dr. McKenna said, offers some perspective on short-term versus long-term risk.

“Most people consider obesity a negative prognostic factor,” he said. “But in terms of operative risk, it’s kind of a neutral effect. It doesn’t really affect the short-term outcomes of an operation.”

Still, Dr. McKenna warned that a visual assessment of patient body condition is not enough to predict postoperative risk. Instead, he recommended accurate height and weight measurements during annual and preoperative exams. He also noted that more patients are at risk than clinicians may suspect.

“Even definitions that didn’t previously exist, such as mild malnutrition, had a somewhat negative effect within colorectal cancer and esophageal cancer,” Dr. McKenna said. “So these are patients who previously probably would be considered pretty healthy, but there is probably some room to improve their nutritional status.”

While the study revealed that different types of cancer should have unique tools for measuring nutritional status, development of these systems will require more research concerning prehabilitation outcomes, according to Dr. McKenna. In the meantime, he highlighted a point of action in the clinic.

“We think, overall, especially with the rise of neoadjuvant chemotherapy upfront, before surgery, that identifying patients at risk before they start neoadjuvant chemotherapy is going to be important,” he said. “They are the ones who really need to be targeted.”

There was no external funding for this study, and the investigators reported no conflicts of interest.

SOURCE: McKenna NP et al. J Am Coll Surg. 2020 Feb 26. doi: 10.1016/j.jamcollsurg.2019.12.034.


Intensive AT/RT regimen sets new efficacy benchmark

Article Type
Changed
Thu, 03/05/2020 - 14:22

 

Intensive postoperative chemotherapy and focal radiation may improve event-free survival (EFS) among patients with atypical teratoid/rhabdoid tumors (AT/RT), results of the phase 3 ACNS0333 trial suggest.

Compared with historical therapies, the treatment protocol reduced the risk of EFS events by 57%, reported lead author Alyssa T. Reddy, MD, of the University of California San Francisco Benioff Children’s Hospital, and colleagues, who also noted that this is the first AT/RT-specific cooperative group trial.

“Case series and retrospective data suggested high-dose chemotherapy with peripheral blood stem cell (PBSC) rescue, early radiation therapy, and methotrexate had activity against AT/RT,” the investigators wrote in the Journal of Clinical Oncology.

Based on these findings, the investigators designed the ACNS0333 treatment protocol. Following surgery, all patients received two cycles of induction with methotrexate, vincristine, etoposide, cyclophosphamide, and cisplatin. They then underwent PBSC harvest.

Next, patients were divided into two groups based on age, disease location, and extent. Conformal radiotherapy was given between induction and consolidation to patients who were at least 6 months of age with tumor localized to the infratentorial brain or at least 12 months of age with tumor localized to the supratentorial brain. For younger patients or those with metastatic disease, radiotherapy was administered after consolidation was complete. Consolidation consisted of three cycles of thiotepa and carboplatin with PBSC support.
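As a rough illustration of the age- and location-based branching just described, the sketch below encodes the radiotherapy-timing rule in Python. It is a simplification for readers, not part of the trial protocol, and the function name and string labels are invented here.

```python
def radiotherapy_timing(age_months, tumor_location, metastatic):
    """Illustrative summary of when conformal radiotherapy was given.

    tumor_location is "infratentorial" or "supratentorial" for localized
    disease; all names and labels are invented for this sketch.
    """
    if metastatic:
        return "after consolidation"
    if tumor_location == "infratentorial" and age_months >= 6:
        return "between induction and consolidation"
    if tumor_location == "supratentorial" and age_months >= 12:
        return "between induction and consolidation"
    return "after consolidation"  # patients too young for early radiation


# Example: a 10-month-old with localized infratentorial disease
print(radiotherapy_timing(10, "infratentorial", metastatic=False))
```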

In addition to efficacy and safety analyses, molecular testing was performed on frozen tumor tissue and blood. This enabled a retrospective exploratory analysis of methylation profiles, which were used to subtype disease into three molecular classes: 2A/TYR, 2B/MYC, or 1/SHH-NOTCH.
 

Patient characteristics and results

The evaluable cohort included 65 patients, most of whom (83%) were younger than 36 months of age at baseline. About half of patients (51%) had infratentorial tumors, slightly fewer (40%) had supratentorial tumors, and 7.5% had a contiguous tumor in both locations. About one-third of patients (37%) had metastatic disease, and almost two-thirds (62%) had residual disease after surgery.

The median follow-up was 4.7 years. At 4 years, patients had an EFS rate of 35%. For patients aged 36 months or older, the 4-year EFS rate was higher still, at 48%.

These EFS rates compared favorably with the 6.4% EFS rate observed in a historical cohort of patients from the CCG-9921 trial (J Clin Oncol. 2005 Oct 20;23[30]:7621-31) and the POG-9233/4 trial (Neuro Oncol. 2014 Mar;16[3]:457-65). Overall, there was a 57% reduction in risk of EFS events in the ACNS0333 cohort compared with the historical controls (hazard ratio, 0.43; P less than .0005).
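For readers wondering how the 57% figure relates to the hazard ratio, it is simply one minus the reported value; a one-line check using the number just cited:

```python
hazard_ratio = 0.43               # reported for ACNS0333 vs. historical controls
risk_reduction = 1 - hazard_ratio # 0.57, i.e., the 57% reduction in EFS events
print(f"{risk_reduction:.0%}")    # prints "57%"
```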

The 4-year overall survival rate was 43% for the entire ACNS0333 cohort and 57% for patients aged 36 months or older. Looking at molecular subtypes, patients with 1/SHH-NOTCH tumors had the best 4-year overall survival rate, at 56%, compared with 41% for 2A/TYR and 27% for 2B/MYC.

Adverse events were predominantly hematologic or infectious events reported during the induction and consolidation phases. Grade 4 or higher adverse events that occurred in at least 5% of patients were hypokalemia, hypotension, hypoxia, sepsis, ALT increase, and decreases in lymphocyte, neutrophil, platelet, and white blood cell counts.

Eight deaths were reported, four of which were associated with treatment. Causes of treatment-related death were sepsis after prolonged myelosuppression, respiratory failure from pulmonary fibrosis, and central nervous system necrosis (n = 2). One patient with central nervous system necrosis had viral encephalitis and sepsis at the time of death.

“ACNS0333 has shown that intensive multimodal therapy significantly improves survival for patients with AT/RT,” the investigators concluded. “However, further intensification using cytotoxic agents is likely not feasible. There are increasing data suggesting that AT/RT may be a good candidate for pathway-specific targeted therapies.”

The study was funded by the Children’s Oncology Group, the National Institutes of Health, the St. Baldrick’s Foundation, the Canadian Cancer Society, and the Children’s of Alabama Kaul Pediatric Research Institute. The investigators disclosed relationships with Novartis, AstraZeneca, and Merck Sharp & Dohme.

SOURCE: Reddy AT et al. J Clin Oncol. 2020 Feb 27. doi: 10.1200/JCO.19.01776.


E. coli strain directly linked with CRC mutational signature

Article Type
Changed
Wed, 05/26/2021 - 13:45

Individuals exposed to pks+ Escherichia coli may have an increased risk of colorectal cancer (CRC), suggesting that detection and removal of this genotoxic strain could reduce CRC risk, according to investigators.

While previous studies have demonstrated associations between various intestinal bacteria and CRC, this is the first study to show a direct link between exposure to a particular bacterial strain and a unique mutational signature, reported lead author Cayetano Pleguezuelos-Manzano, of the Hubrecht Institute in Utrecht, the Netherlands.

Recent studies showed that colibactin, a toxin produced by pks+ E. coli, causes a specific type of DNA damage, although the outcome of this damage remained unclear, the investigators wrote in Nature.

To look for a possible mutational signature resulting from this damage, the investigators used human intestinal organoids, which were established from primary crypt stem cells. A pks+ E. coli strain was microinjected into one group of organoids, while another E. coli strain (pks∆clbQ), which does not produce colibactin, was injected into a second group.

Immunofluorescence showed that the organoids exposed to the pks+ E. coli strain developed characteristic DNA damage, whereas the control group did not.

Next, the investigators repeatedly injected organoids with either pks+ E. coli, pks∆clbQ, or dye only. This experiment was conducted for 5 months to achieve long-term exposure. Whole-genome sequencing showed that the pks+ E. coli group developed two unique mutational signatures: a single-base substitution signature (SBS-pks) and a small indel signature (ID-pks). Neither of the other two groups developed these signatures, which suggests that they were a direct consequence of exposure to pks+ E. coli.

To determine the prevalence of such mutational signatures in human patients, the investigators looked for the SBS-pks and ID-pks signatures in 5,876 human cancer genomes. One analysis involving 496 CRC metastases showed strong enrichment of both signatures, compared with other cancer types (P less than .0001). Another analysis involving 2,208 CRC tumors found that 5.0% and 4.4% of patients had SBS-pks and ID-pks enrichment, respectively.

“This study implies that detection and removal of pks+ E. coli, as well as re-evaluation of probiotic strains harboring the pks island, could decrease the risk of cancer in a large group of individuals,” the investigators concluded.

The study was funded by the Ministry of Education, Culture and Science of the government of the Netherlands. The investigators reported additional relationships with OrigiMed, Bayer, Janssen, and others.

SOURCE: Pleguezuelos-Manzano C et al. Nature. 2020 Feb 27. doi: 10.1038/s41586-020-2080-8.


Team approach greatly reduces inappropriate PPI use

Article Type
Changed
Tue, 06/16/2020 - 14:33

A multidisciplinary, patient-centered approach can dramatically reduce inappropriate use of proton pump inhibitors (PPIs), based on the success of a quality improvement study conducted at an academic primary care clinic in New York.

In 1 year, the program reduced inappropriate PPI use from 80% to 30%, reported lead author Naren Nallapeta, MD, an internal medicine resident at the State University of New York at Buffalo, and colleagues.

According to the investigators, inappropriate use of PPIs is a common and growing concern. Recent patent expirations have led to wider availability of generic and over-the-counter options, and the consequences of unnecessary use can be serious.

“PPI intake has been found to have a significant association with community-acquired pneumonia, Clostridium difficile associated diarrhea, impaired B12 absorption, hypomagnesemia, hip fractures, acute and chronic kidney disease, and spontaneous bacterial peritonitis in patients with cirrhotic ascites,” the investigators wrote in the Journal of Clinical Gastroenterology.

In 2017, the American Gastroenterological Association released guidelines that include indications for PPIs. But these guidelines often go unheeded, the investigators noted. Multiple studies have documented rates of inappropriate PPI use ranging from 54.1% to 82%. Previous studies have yielded mixed results: Simple physician education alone was found insufficient to reduce inappropriate prescriptions in a primary care setting, whereas studies involving pharmacy personnel have had positive results in various treatment centers.

In a survey of their own clinic, an internal medicine service at Erie County Medical Center (a tertiary care safety-net hospital affiliated with the University at Buffalo), the investigators found that 80% of PPI prescriptions were inappropriate. This prompted a goal of reducing inappropriate use to less than 60% within 1 year.

To achieve this goal, the investigators started a quality improvement project based on the Plan-Do-Study-Act (PDSA) model of health care improvement. The quality improvement team included internal medicine attending physicians and residents, gastroenterologists, patients, nurses, administrative and information technology staff, and a social worker.

After identifying root causes, the team deployed a variety of strategies to cut down on inappropriate PPI use. First, a new prompt in the electronic medical record reminded physicians to discuss PPI use with patients. Physicians were also given additional training concerning appropriate indications for PPIs, as well as a pocket guide and brochures that could be used for patient education. These efforts were supplemented by an enhanced nursing workflow, as well as continuous reinforcement with positive feedback for health care workers involved.
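To give a sense of what such an electronic medical record prompt might look like, here is a minimal, hypothetical sketch; the indication list and matching logic are invented for illustration and do not reproduce the clinic's actual rule or the full 2017 AGA guidance.

```python
# Hypothetical indications; a real prompt would draw on the 2017 AGA guidance.
CHRONIC_PPI_INDICATIONS = {
    "erosive esophagitis",
    "barrett's esophagus",
    "chronic nsaid use with bleeding risk",
}

def flag_for_ppi_review(active_medications, problem_list):
    """Return True when a patient is on a PPI with no documented indication."""
    on_ppi = any("prazole" in med.lower() for med in active_medications)  # crude drug match
    documented = any(problem.lower() in CHRONIC_PPI_INDICATIONS for problem in problem_list)
    return on_ppi and not documented


# Example: omeprazole with no qualifying diagnosis triggers the discussion prompt
print(flag_for_ppi_review(["Omeprazole 20 mg daily"], ["Hypertension"]))  # True
```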

One year later, results exceeded expectations. Based on data from 180 patients, inappropriate use decreased from 80% to 30%, with a mean discontinuation rate of approximately 50% that was maintained for 6 months beyond the end of the study. The annual direct cost savings of the program totaled $13,992.

When indicated, patients on long-term PPIs were referred to the gastroenterology service for esophagogastroduodenoscopy. About half of the referred patients (49.8%) completed this procedure, which exceeded the baseline completion rate of less than 30%.

According to principal author Smita Bakhai, MD, of the department of internal medicine at the University at Buffalo, and a physician with UBMB Internal Medicine, the strategies used in this study are broadly applicable.

“The multidisciplinary approach, including patient engagement, would work well in any setting, even with limited resources and without the use of pharmacy personnel,” Dr. Bakhai said in an interview. “We used a patient-centered approach and physicians used shared decision making with patients to taper and discontinue PPIs or switch them to an H2-blocking agent when [patients] did not need to be on chronic PPIs.”

Dr. Bakhai went on to highlight the importance of working with multiple stakeholders.

“This project is unique because it was a resident-led project,” she said. “As academicians, we should always engage the fellows and the residents in any quality improvement work that we do, because they are the doctors of the future.”

Beyond care providers and patients, Dr. Bakhai emphasized the need to involve administrative leadership who can guarantee long-term resources and cultivate the right culture.

“Without the resources, you can’t sustain the project,” Dr. Bakhai said. “You have to have allocated resources, and a culture of safety and quality in the environment that you are doing the project. It has to be a supportive environment. We had all of those things, and that’s why we succeeded.”

The study was funded by the National Institutes of Health. The investigators reported no conflicts of interest.

SOURCE: Nallapeta N et al. J Clin Gastroenterol. 2020 Feb 14. doi: 10.1097/MCG.0000000000001317.


SOLO3: Olaparib outperforms chemo in heavily pretreated ovarian cancer

Article Type
Changed
Wed, 02/26/2020 - 13:17

For heavily pretreated patients with BRCA-mutated ovarian cancer, olaparib is more effective than nonplatinum chemotherapy, according to results from the SOLO3 trial.

Both objective response rate and progression-free survival were significantly better in the olaparib group, reported lead author Richard T. Penson, MD, of Massachusetts General Hospital, Boston, and colleagues. These findings were published in the Journal of Clinical Oncology.

SOLO3 provides much-needed data for an understudied but common group of patients, according to Kathleen Moore, MD, lead author of the previous SOLO1 trial and associate director of clinical research at the Stephenson Cancer Center at the University of Oklahoma in Oklahoma City.

“[Approximately] 50% of the women who participated in SOLO3 were on their fourth or more line of chemo,” Dr. Moore said in an interview. “That group of patients has not been studied in the past ... Even though we know many women receive many, many lines of chemotherapy, we really don’t have high quality clinical trial data to give women any indication of what to expect in terms of their response rate.”

SOLO3 is also notable, Dr. Moore said, because it is the first phase 3 trial to compare a PARP inhibitor with physician's choice chemotherapy.

SOLO3 involved 266 patients with BRCA-mutated, relapsed ovarian cancer who had received two or more lines of platinum-based chemotherapy. All patients were platinum sensitive (progression more than 12 months after last platinum-based treatment) or partially platinum sensitive (progression within 6-12 months of last platinum-based treatment).

After enrollment, participants were randomized in a 2:1 ratio to receive either olaparib (300 mg twice daily) or physician’s choice, single-agent, nonplatinum chemotherapy (pegylated liposomal doxorubicin, paclitaxel, gemcitabine, or topotecan). The primary endpoint was objective response rate, determined by blinded independent central review. Secondary endpoints included progression-free survival and overall survival.

At a median follow-up of 13.8 months in the olaparib group and 3.9 months in the chemotherapy group, olaparib showed a significant efficacy advantage. Among 223 patients with measurable disease, the objective response rate was 72.2% in the olaparib group and 51.4% in the chemotherapy group (odds ratio, 2.53; P = .002).
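As a rough sanity check, the reported odds ratio can be approximated from the response rates themselves; the published figure of 2.53 comes from the trial's own analysis, so the back-of-the-envelope value below is only close, not exact.

```python
orr_olaparib, orr_chemo = 0.722, 0.514   # objective response rates from the trial

odds_olaparib = orr_olaparib / (1 - orr_olaparib)
odds_chemo = orr_chemo / (1 - orr_chemo)
print(f"Unadjusted odds ratio ~= {odds_olaparib / odds_chemo:.2f}")  # ~2.46
```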

The superiority of olaparib was maintained in multiple subgroups of patients, including those who had received only two prior lines of therapy (OR, 3.44) and those who had three or more prior lines (OR, 2.21). Similar benefits were observed regardless of age.

Across all patients, the median progression-free survival was significantly better in the olaparib group, at 13.4 months, versus 9.2 months in the chemotherapy group (P = .013). Overall survival data were immature.

No new safety signals were encountered. The most common adverse events (AEs) in the olaparib group were nausea, fatigue/asthenia, anemia, vomiting, and diarrhea. The most common AEs in the chemotherapy group were fatigue/asthenia, palmar-plantar erythrodysesthesia, nausea, neutropenia, and anemia.

The rate of serious AEs was 23.6% in the olaparib group and 18.4% in the chemotherapy group. Three patients in the olaparib group, and none in the chemotherapy group, developed new primary malignancies. There were four patients with fatal AEs in the olaparib group (myelodysplastic syndrome, cardiopulmonary decompensation, sepsis, and acute myeloid leukemia and subarachnoid hemorrhage), and there was one fatal AE in the chemotherapy group (mesenteric vein thrombosis).

Dr. Moore pointed out that these findings are relevant to current clinical practice, but a shifting treatment landscape may soon render SOLO3 data obsolete.

“PARP [inhibitors] are likely moving into the frontline,” Dr. Moore said. “So this population of women who have not received a PARP [inhibitor] and are recurrent is still here, and they will be for several years, but there’s going to be a point when they’re not going to be here anymore, because they’ll all have received [a PARP inhibitor] front line.”

Concerning the broader research landscape for PARP inhibitors, Dr. Moore suggested that investigators are currently in a “waiting period” while the Food and Drug Administration and European Medicines Agency interpret major clinical trials, such as PAOLA-1 and PRIMA.

A variety of patient populations and clinical scenarios remain unevaluated, Dr. Moore said, including patients who don’t respond to PARP inhibitors, those who have disease recurrence while taking PARP inhibitors, and which drug combinations to use for which patients.

“There are a lot of irons in the fire right now, just getting ready to launch, but I think we need to see what the population is that’s going to be exposed to PARP [inhibitors] next, so we can design the next round of studies,” Dr. Moore said. “It’s an exciting time.”

SOLO3 was funded by AstraZeneca and Merck. The investigators reported additional relationships with Clovis Oncology, Eisai, Tesaro, and other companies. Dr. Moore disclosed relationships with Genentech, Immunogen, Mersana, and other companies.

SOURCE: Penson et al. J Clin Oncol. 2020 Feb 19. doi: 10.1200/JCO.19.02745.


Tumor neoantigenicity metric improves prediction of response to immunotherapy

Article Type
Changed
Mon, 02/24/2020 - 09:43

A new tumor neoantigenicity metric may improve prediction of response to immunotherapy in patients with melanoma, lung cancer, and kidney cancer, a retrospective analysis suggests.

The new metric, known as the Cauchy-Schwarz index of neoantigens (CSiN) score, incorporates both immunogenicity and clonality, according to lead study author Tianshi Lu, a PhD candidate at the University of Texas Southwestern Medical Center in Dallas, and colleagues.

“The major biological insight from this study is that the neoantigen clonal structure in each tumor specimen and the immunogenicity of the neoantigens (represented by the MHC-binding strength in our study) are predictive of response to checkpoint inhibitors and prognosis,” the investigators wrote in Science Immunology.
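
The published CSiN formula is not reproduced here. The toy Python sketch below only illustrates the general idea the investigators describe, weighting each predicted neoantigen by its clonality and by its MHC-binding strength; all names, cutoffs, and weights are illustrative assumptions rather than the authors' method.

```python
# Toy stand-in for a clonality- and affinity-weighted neoantigen index (NOT the published CSiN).
from typing import List, NamedTuple

class Neoantigen(NamedTuple):
    cancer_cell_fraction: float  # 0..1, clonality of the underlying mutation
    mhc_ic50_nm: float           # predicted MHC binding affinity in nM (lower = stronger binder)

def toy_neoantigenicity_index(neoantigens: List[Neoantigen]) -> float:
    """Average per-neoantigen score: clonality times a capped binding-strength weight."""
    if not neoantigens:
        return 0.0
    score = 0.0
    for n in neoantigens:
        binding_weight = 500.0 / max(n.mhc_ic50_nm, 1.0)  # 500 nM is a common binder cutoff
        score += n.cancer_cell_fraction * min(binding_weight, 10.0)
    return score / len(neoantigens)

# A tumor dominated by clonal, strong-binding neoantigens scores higher than a subclonal, weak-binding one.
high = [Neoantigen(0.9, 50.0), Neoantigen(0.8, 100.0)]
low = [Neoantigen(0.2, 400.0), Neoantigen(0.1, 900.0)]
print(toy_neoantigenicity_index(high) > toy_neoantigenicity_index(low))  # True
```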

The study involved 2,479 patients with various cancers, including immunogenic types such as renal cell carcinoma (RCC), and nonimmunogenic types, such as pediatric acute lymphocytic leukemia.

The investigators first evaluated CSiN in relation to clinical outcome among patients with immunogenic cancers who received immunotherapy. Drawing data from multiple cohorts, they found that patients who had better responses to therapy were significantly more likely to have above-average CSiN scores than those who had worse responses.

In one cohort of patients with melanoma who received anti–CTLA-4 therapy, those with better responses were more likely to have high CSiN scores (P = .009). In another cohort of melanoma patients who received anti–CTLA-4 therapy, those with higher CSiN scores were more likely to achieve durable clinical benefit (response or stable disease for more than 6 months), compared with patients who had lower CSiN scores (P = .033).

Among patients with clear cell RCC treated with anti-PD-1/PD-L1 therapy, there was a significant positive association between higher CSiN scores and better response (P = .036). Among T effector-high patients with metastatic clear cell RCC, there was a significant association between higher CSiN scores and better response to atezolizumab (P = .028) but not sunitinib (P = .890).

In a cohort of patients with non–small cell lung cancer treated with checkpoint inhibitors, those with sustained responses were more likely to have higher CSiN scores than were patients with short-term progression (P = .015).

The investigators also compared the predictive power of CSiN with existing neoantigenicity metrics, ultimately concluding that CSiN was superior.

“Overall, the neoantigen load and neoantigen fitness models were not as strongly predictive of treatment response as CSiN,” the investigators wrote.

Again using data from patients with immunogenic cancers, the investigators looked for an association between CSiN score and overall survival. Indeed, patients with higher-than-average CSiN scores had significantly better survival than those with lower scores (P less than .001). This finding was maintained in a multivariate analysis that accounted for disease type, stage, sex, and age.
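
As a rough illustration of the kind of multivariate survival analysis described above, the sketch below fits a Cox proportional hazards model with the lifelines library. The data frame, file name, and column names are hypothetical, and the authors' actual modeling choices may differ.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical columns: survival_months, death_event (0/1), csin_score, disease_type, stage, sex, age
df = pd.read_csv("cohort.csv")  # placeholder file name

# Encode the categorical covariates so the model adjusts for disease type, stage, and sex.
design = pd.get_dummies(
    df[["survival_months", "death_event", "csin_score", "disease_type", "stage", "sex", "age"]],
    columns=["disease_type", "stage", "sex"],
    drop_first=True,
)

cph = CoxPHFitter()
cph.fit(design, duration_col="survival_months", event_col="death_event")
cph.print_summary()  # hazard ratio for csin_score, adjusted for the other covariates
```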

In contrast with the above findings, CSiN did not predict survival among patients with nonimmunogenic cancer types.

“Overall, our work offers a rigorous methodology of predicting response to immunotherapy and prognosis from routine patient samples and should be useful for personalizing medicine in the modern era of immunotherapy,” the investigators concluded.

The study was funded by the National Institutes of Health, the Cancer Prevention Research Institute of Texas, and the American Cancer Society. The investigators reported no conflicts of interest.

SOURCE: Lu et al. Sci Immunol. 2020 Feb 21. doi: 10.1126/sciimmunol.aaz3199.


HBV: Surface antigen titer and ALT predict seroconversion

Article Type
Changed
Mon, 06/08/2020 - 16:30

Among patients with hepatitis B virus (HBV) infection who are not receiving antiviral therapy, surface antigen titers and alanine aminotransferase (ALT) levels may independently predict spontaneous seroconversion, based on a recent case-control study.

Patients with hepatitis B surface antigen (HBsAg) titers less than 1,000 IU/mL were significantly more likely to spontaneously seroconvert, reported principal author Sammy Saab, MD, of the University of California, Los Angeles, and colleagues.

While the predictive value of HBsAg titers has been demonstrated for patients undergoing antiviral therapy, data are limited for spontaneous seroconversion, the investigators wrote in Journal of Clinical Gastroenterology.

To learn more about this scenario, the investigators reviewed medical records from 2,126 patients who visited a large community practice in the Los Angeles area between 2014 and 2019. Cases were defined by HBV infection with seroconversion, whereas matched controls were defined by HBV without seroconversion. A variety of demographic and clinical data were also evaluated, including age, ethnicity, sex, HBsAg titer, ALT, HBV DNA, total cholesterol, presence of fatty liver, and other factors.

The investigators identified 167 patients with HBV who were not on antiviral therapy. Of these, 14 underwent seroconversion and were matched with 70 patients who did not seroconvert. All patients were of Asian descent, most were women, and none had cirrhosis.

Across all demographic and clinical parameters, the two factors that significantly differed between cases and controls were ALT and HBsAg titer. The mean ALT for patients who seroconverted was 17.6 U/L, versus 25.1 U/L in those who did not undergo seroconversion (P less than .01). Similarly, mean titer was lower in the seroconversion group (459.8 vs. 782.0 IU/mL; P = .01).

The investigators noted that seroconversion was more common among patients with an HBsAg titer less than 1,000 IU/mL. Specifically, 79% of patients who seroconverted had a titer less than 1,000 IU/mL, compared with just 16% of patients who did not seroconvert (P = .001).
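
As a minimal sketch of how such a comparison can be checked, the code below runs Fisher's exact test on a 2x2 table with counts reconstructed from the reported percentages (79% of 14 seroconverters is roughly 11; 16% of 70 non-seroconverters is roughly 11). The authors' exact tabulation and statistical test may differ.

```python
from scipy.stats import fisher_exact

# Rows: seroconverted (n = 14) vs did not seroconvert (n = 70)
# Columns: HBsAg titer < 1,000 IU/mL vs >= 1,000 IU/mL (counts are approximate reconstructions)
table = [[11, 3],
         [11, 59]]

odds_ratio, p_value = fisher_exact(table)
print(f"OR ~{odds_ratio:.1f}, P = {p_value:.4f}")
```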

HBV DNA levels were not predictive of seroconversion, the investigators noted, which aligns with most, but not all, previous research.

The investigators reported no disclosures.

SOURCE: Wu CF et al. J Clin Gastroenterol. 2020 Feb 11. doi: 10.1097/MCG.0000000000001324.
