Survey colonoscopy outpatients to flag high cancer risk
Point-of-care surveys on family history of colorectal cancer for individuals undergoing colonoscopy can identify people who would benefit from genetic evaluation, a study showed.
The feasibility and performance of two survey methods – one paper and one electronic – were evaluated to identify individuals at high genetic risk of colorectal cancer.
“Multiple studies have shown that family history assessments performed in primary care and in oncology and gastroenterology clinical settings are incomplete or inaccurate,” wrote Tannaz Guivatchian, MD, of the department of internal medicine at the University of Michigan Hospital, Ann Arbor, and coauthors. “There remains a need for targeted family history assessments to screen patients for hereditary cancer syndromes at point-of-care cancer screenings, such as colonoscopy.”
In the first cohort of the current study, a five-question paper survey was given to 600 patients after they had checked in for their colonoscopy, and the results were immediately given to the endoscopist performing the procedure. The second cohort of 100 patients took the paper survey and a more comprehensive tablet-based electronic survey (Gastrointest Endosc. 2017;86[4]:684-91).
The paper survey alone identified 60 colonoscopy patients (10%) as high risk because they met at least 1 of the 10 genetic referral criteria. The retrospective chart review 60 days after the procedure showed that 32 patients (5.3%) were referred for genetic evaluation, 31 of whom met at least 1 of the 10 criteria for referral.
Of the patients picked up by the paper survey, 21 (35%) had documentation of a genetic evaluation. Seven of these had germline mutations that predisposed them to cancer, 10 had undergone genetic testing that did not find any pathogenic mutations, and 4 did not undergo genetic testing; 3 were lost to follow-up and 1 was in hospice.
The research team also sought feedback from 21 endoscopists about the paper survey. The majority (85%) found the risk assessment tool helpful, with nearly three-quarters of them (71%) saying that it influenced their surveillance recommendations and 28.5% saying it prompted them to refer the patient for genetic evaluation.
In the second cohort, 9 of the 100 patients were found to be high risk by meeting at least 1 of the 10 criteria on the paper survey and/or achieving a PREMM1,2,6 score – a tool for assessing the likelihood of mutations associated with Lynch syndrome – of 5% or higher.
Of these nine patients, six were flagged for genetic evaluation based on the results of the paper survey, and three were picked up by the electronic survey. Three were referred for genetic evaluation.
An additional patient was also flagged for genetic evaluation after a review of the patient’s electronic medical record picked up information that the patient did not provide in either the paper or electronic survey.
In this second phase of the study, researchers found that only 73% of the patients approached were able to successfully complete the electronic survey before their procedure. The team had also mailed letters to 500 patients inviting them to complete the electronic survey at home before their colonoscopy appointment, but only two patients did so.
“Although several family history surveys and CRC [colorectal cancer] risk assessment tools have been published in the literature, operationalizing cancer risk assessments in busy clinical settings has been a consistent barrier to implementation,” the authors wrote. “Our results using both electronic and paper-based tools demonstrate that collection and review of family history information is feasible in the outpatient colonoscopy setting and provides physicians with information for CRC risk assessment that is immediately relevant to patient care.”
The authors stressed that while the short paper survey could be filled out quickly and had a near 100% completion rate, the more comprehensive electronic survey provided a more complete family cancer history, which would help clinicians identify patients needing genetic evaluation. They also pointed out that each survey method identified patients not picked up by the other.
The study was supported by the National Cancer Institute. No conflicts of interest were declared.
FROM GASTROINTESTINAL ENDOSCOPY
Key clinical point: Surveying colonoscopy patients about their family history before they undergo the procedure can help identify those who would benefit from genetic evaluation for colorectal cancer risk.
Major finding: Paper and electronic surveys, administered at the point of care, identified several patients at high familial risk of colorectal cancer.
Data source: A prospective study of two survey methods in 700 patients undergoing colonoscopies.
Disclosures: The study was supported by the National Cancer Institute. No conflicts of interest were declared.
Reminder calls to patients improve fecal test response
Automated and live phone calls were shown to improve patient return of fecal test samples for both English and Spanish speakers, based on the results of a pilot study.
Colorectal cancer (CRC) is the second deadliest cancer in the United States. Screening has been shown to be a very effective tool for decreasing the mortality and incidence of CRC, but screening rates are low, with only 63% of adults adhering to recommended screening schedules. Direct-mail fecal immunochemical testing (FIT) kits with associated reminders have been used to address this problem, but until this pilot study, few studies had evaluated the effectiveness of follow-up techniques on FIT return rates.
“While many direct-mail fecal testing programs have delivered patient reminders, ours is the first study to rigorously test the effectiveness of these reminders in a community health center population, and among patients with differing language preferences,” wrote Gloria Coronado, PhD, of the Center for Health Research at Kaiser Permanente and her colleagues.
The trial had two groups, one randomized and the other nonrandomized. Nonrandomized patients had active patient portals and received email reminders through the portal. The randomized group was sorted into seven intervention groups: Four of the groups used a unimodal contact method, and three groups used a multimodal contact method. The unimodal contact methods were letter reminders, automated call reminders, text reminders, and live call reminders. The multimodal contact methods were a reminder letter plus live call reminders, automated calls plus live call reminders, and text message reminders plus live call reminders. All written materials to contact patients were developed in English and later translated into Spanish and Russian. Phone call scripts were also developed in English and later translated into Spanish but not Russian because of a lack of Russian-speaking outreach workers.
After combining early-return FIT samples, those in the nonrandomized patient portal group, and the randomized samples, the overall return rate was 32.7%.
The method of contact strongly influenced return rates. Patients who received live phone calls were 50% more likely to return their FIT kit, compared with those who simply received a reminder email. Both English and Spanish speakers were much more likely to return their FIT kits if they were contacted with live or automated phone calls (odds ratios of 2.17 and 3.45, respectively). All other reminder techniques that did not include a phone call had completion rates similar to that of a reminder letter.
“Automated phone calls and text messages are the least costly options to implement, yet live reminders may allow staff to address or triage other patient health care needs,” they wrote.
Dr. Coronado was a coinvestigator for a study funded by Epigenomics. All other authors had no conflicts of interest to report.
AGA Resource
AGA offers education materials to help your patients better understand their colorectal cancer screening options. Learn more here.
FROM THE JOURNAL OF GENERAL INTERNAL MEDICINE
Ribaxamase reduced new CDI by 71%
SAN DIEGO – Ribaxamase reduced the incidence of new-onset Clostridium difficile infection by 71%, compared with placebo, results from a phase 2b study showed.
At an annual scientific meeting on infectious diseases, lead investigator John F. Kokai-Kun, PhD, said that the finding represents a paradigm shift in the use of intravenous beta-lactam antibiotics to prevent opportunistic infections. “We currently treat Clostridium difficile infection (CDI) with antibiotics, which attack the vegetative cells,” said Dr. Kokai-Kun, vice president of nonclinical affairs for Rockville, Md.–based Synthetic Biologics, which is developing ribaxamase. “Since C. diff. is primarily a toxin-mediated disease, certain products seem to neutralize the toxin. There’s also been work with probiotics and prebiotics to try to strengthen and repair the dysbiotic colon. Fecal replacement therapy has been shown to be fairly effective for treatment of recurrent C. diff. infection. What if we could simply block the initial insult that leads to this cascade? That’s the damage caused to the gut microbiome by the antibiotic that’s excreted to the intestine.”
That’s where ribaxamase comes in, he said at the combined annual meetings of the Infectious Diseases Society of America, the Society for Healthcare Epidemiology of America, the HIV Medicine Association, and the Pediatric Infectious Diseases Society. Ribaxamase is an orally administered beta-lactamase designed to degrade penicillin and cephalosporins in the intestinal lumen. It’s formulated for release in the proximal small intestine and is expected to be given during or a short time after administration of IV beta-lactam antibiotics such as ceftriaxone. “This is expected to degrade the excess antibiotics that are excreted into the small intestine via the bile,” Dr. Kokai-Kun explained. “It’s designed to prevent disruption of the gut microbiome and thus protect from opportunistic GI infections like CDI.”
Early-stage clinical studies demonstrated that ribaxamase was well tolerated and that it is not systemically absorbed, while phase 2 studies showed that ribaxamase degrades ceftriaxone in the intestine to below the level of detection while not affecting the pharmacokinetics of ceftriaxone in the plasma.
For the current study, 412 patients were enrolled at 84 multinational clinical sites. These patients were admitted to the hospital for treatment of a lower respiratory tract infection and were randomized 1:1 to receive ceftriaxone plus 150 mg ribaxamase or ceftriaxone plus placebo. Patients in both groups could also receive an oral macrolide at the discretion of the clinical investigator.
The researchers also obtained fecal samples at screening, 72 hours post antibiotic treatment, and at the end of a 4-week follow-up visit, to determine colonization by opportunistic pathogens and to examine changes in the gut microbiome.
Patients were monitored for 6 weeks for diarrhea and CDI. Diarrhea was defined as three or more loose or watery stools in a 24-hour period. “If that occurred, then we collected a sample, which was sent to the local lab to determine the presence of C. difficile toxins,” Dr. Kokai-Kun said.
The average age of study participants was 70 years, and about one-third in each arm received oral macrolides. The numbers of adverse events and serious adverse events were similar between the active and placebo arms, and there was no trend associated with ribaxamase use. The lower respiratory tract infection cure rate with ceftriaxone treatment was about 99% in both arms at 72 hours post treatment and at 2 weeks post treatment.
To analyze changes in the gut microbiome, the researchers conducted 16S rRNA sequencing of DNA extracted from fecal samples. In all, 652 samples were sequenced from 229 patients. Results from that analysis suggest that ribaxamase “appears to protect the gut microbiome from the onslaught of the ceftriaxone,” he said.
Ribaxamase reduced the incidence of new-onset CDI by 71%, compared with placebo (P = .045). “It apparently did this by protecting the integrity of the gut microbiome,” Dr. Kokai-Kun said. “There was also a significant reduction of new colonization by vancomycin-resistant enterococci at 72 hours and 4 weeks (P = .0001 and P = .0002, respectively) which is an opportunistic pathogen that is known to be able to inhabit gut microbiome when there is dysbiosis.”
The study was sponsored by Synthetic Biologics. Dr. Kokai-Kun is an employee of the company.
AGA Resource
Help your patients better understand C. difficile by using AGA’s patient education materials available here.
SAN DIEGO – , results from a phase 2b study showed.
At an annual scientific meeting on infectious diseases, lead investigator John F. Kokai-Kun, PhD, said that the finding represents a paradigm shift in the use of intravenous beta-lactam antibiotics to prevent opportunistic infections. “We currently treat Clostridium difficile infection (CDI) with antibiotics, which attack the vegetative cells,” said Dr. Kokai-Kun, vice president of nonclinical affairs for Rockville, Md.–based Synthetic Biologics, which is developing ribaxamase. “Since C. diff. is primarily a toxin-mediated disease, certain products seem to neutralize the toxin. There’s also been work with probiotics and prebiotics to try to strengthen and repair the dysbiotic colon. Fecal replacement therapy has been shown to be fairly effective for treatment of recurrent C. diff. infection. What if we could simply block the initial insult that leads to this cascade? That’s the damage caused to the gut microbiome by the antibiotic that’s excreted to the intestine.”
That’s where ribaxamase comes in, he said at the combined annual meetings of the Infectious Diseases Society of America, the Society for Healthcare Epidemiology of America, the HIV Medicine Association, and the Pediatric Infectious Diseases Society. Ribaxamase is an orally administered beta-lac-tamase designed to degrade penicillin and cephalosporins in the intestinal lumen. It’s formulated for release in the proximal small intestine and is expected to be given during or a short time after administration of IV beta-lactam antibiotics such as ceftriaxone. “This is expected to degrade the excess antibiotics that are excreted into the small intestine via the bile,” Dr. Kokai-Kun explained. “It’s designed to prevent disruption of the gut microbiome and thus protect from opportunistic GI infections like CDI.”
Early-stage clinical studies demonstrated that ribaxamase was well tolerated and that it is not systemically absorbed, while phase 2 studies showed that ribaxamase degrades ceftriaxone in the intestine to below the level of detection while not affecting the pharmacokinetics of ceftriaxone in the plasma.
For the current study, 412 patients were enrolled at 84 multinational clinical sites. These patients were admitted to the hospital for treatment of a lower respiratory tract infection and were randomized 1:1 to receive ceftriaxone plus 150 mg ribaxamase or ceftriaxone plus placebo. Patients in both groups could also receive an oral macrolide at the discretion of the clinical investigator.
The researchers also obtained fecal samples at screening, 72 hours post antibiotic treatment, and at the end of a 4-week follow-up visit, to determine colonization by opportunistic pathogens and to examine changes in the gut microbiome.
Patients were monitored for 6 weeks for diarrhea and CDI. Diarrhea was defined as three or more loose or watery stools in a 24-hour period. “If that occurred, then we collected a sample, which was sent to the local lab to determine the presence of C. difficile toxins,” Dr. Kokai-Kun said.
The average age of study participants was 70 years, and about one-third in each arm received oral macrolides. The number of adverse events and serious adverse events were similar between active and placebo arms, and there was no trend associated with ribaxamase use. The lower respiratory tract infection cure rate to the ceftriaxone treatment was about 99% in both arms at 72 hours post treatment and at 2 weeks post treatment.
To analyze changes in the gut microbiome, the researchers conducted 16S rRNA sequencing of DNA extracted from fecal samples. In all, 652 samples were sequenced from 229 patients. Results from that analysis suggests that ribaxamase “appears to protect the gut microbiome from the onslaught of the ceftriaxone,” he said.
Ribaxamase reduced the incidence of new-onset CDI by 71%, compared with placebo (P = .045). “It apparently did this by protecting the integrity of the gut microbiome,” Dr. Kokai-Kun said. “There was also a significant reduction of new colonization by vancomycin-resistant enterococci at 72 hours and 4 weeks (P = .0001 and P = .0002, respectively) which is an opportunistic pathogen that is known to be able to inhabit gut microbiome when there is dysbiosis.”
The study was sponsored by Synthetic Biologics. Dr. Kokai-Kun is an employee of the company.
AGA Resource
Help your patients better understand C. difficile by using AGA’s patient education materials available here.
[email protected]
SAN DIEGO – , results from a phase 2b study showed.
At an annual scientific meeting on infectious diseases, lead investigator John F. Kokai-Kun, PhD, said that the finding represents a paradigm shift in the use of intravenous beta-lactam antibiotics to prevent opportunistic infections. “We currently treat Clostridium difficile infection (CDI) with antibiotics, which attack the vegetative cells,” said Dr. Kokai-Kun, vice president of nonclinical affairs for Rockville, Md.–based Synthetic Biologics, which is developing ribaxamase. “Since C. diff. is primarily a toxin-mediated disease, certain products seem to neutralize the toxin. There’s also been work with probiotics and prebiotics to try to strengthen and repair the dysbiotic colon. Fecal replacement therapy has been shown to be fairly effective for treatment of recurrent C. diff. infection. What if we could simply block the initial insult that leads to this cascade? That’s the damage caused to the gut microbiome by the antibiotic that’s excreted to the intestine.”
That’s where ribaxamase comes in, he said at the combined annual meetings of the Infectious Diseases Society of America, the Society for Healthcare Epidemiology of America, the HIV Medicine Association, and the Pediatric Infectious Diseases Society. Ribaxamase is an orally administered beta-lactamase designed to degrade penicillins and cephalosporins in the intestinal lumen. It’s formulated for release in the proximal small intestine and is expected to be given during or a short time after administration of IV beta-lactam antibiotics such as ceftriaxone. “This is expected to degrade the excess antibiotics that are excreted into the small intestine via the bile,” Dr. Kokai-Kun explained. “It’s designed to prevent disruption of the gut microbiome and thus protect from opportunistic GI infections like CDI.”
Early-stage clinical studies demonstrated that ribaxamase was well tolerated and not systemically absorbed, while phase 2 studies showed that ribaxamase degrades ceftriaxone in the intestine to below the level of detection without affecting the pharmacokinetics of ceftriaxone in the plasma.
For the current study, 412 patients were enrolled at 84 multinational clinical sites. These patients were admitted to the hospital for treatment of a lower respiratory tract infection and were randomized 1:1 to receive ceftriaxone plus 150 mg ribaxamase or ceftriaxone plus placebo. Patients in both groups could also receive an oral macrolide at the discretion of the clinical investigator.
The researchers also obtained fecal samples at screening, 72 hours post antibiotic treatment, and at the end of a 4-week follow-up visit, to determine colonization by opportunistic pathogens and to examine changes in the gut microbiome.
Patients were monitored for 6 weeks for diarrhea and CDI. Diarrhea was defined as three or more loose or watery stools in a 24-hour period. “If that occurred, then we collected a sample, which was sent to the local lab to determine the presence of C. difficile toxins,” Dr. Kokai-Kun said.
The average age of study participants was 70 years, and about one-third in each arm received oral macrolides. The numbers of adverse events and serious adverse events were similar between the active and placebo arms, and there was no trend associated with ribaxamase use. The lower respiratory tract infection cure rate with ceftriaxone treatment was about 99% in both arms at 72 hours and at 2 weeks post treatment.
To analyze changes in the gut microbiome, the researchers conducted 16S rRNA sequencing of DNA extracted from fecal samples. In all, 652 samples were sequenced from 229 patients. Results from that analysis suggest that ribaxamase “appears to protect the gut microbiome from the onslaught of the ceftriaxone,” he said.
Ribaxamase reduced the incidence of new-onset CDI by 71%, compared with placebo (P = .045). “It apparently did this by protecting the integrity of the gut microbiome,” Dr. Kokai-Kun said. “There was also a significant reduction of new colonization by vancomycin-resistant enterococci at 72 hours and 4 weeks (P = .0001 and P = .0002, respectively) which is an opportunistic pathogen that is known to be able to inhabit gut microbiome when there is dysbiosis.”
The study was sponsored by Synthetic Biologics. Dr. Kokai-Kun is an employee of the company.
AGA Resource
Help your patients better understand C. difficile by using AGA’s patient education materials available here.
Golimumab earns new FDA approvals
The U.S. Food and Drug Administration has approved golimumab (Simponi Aria) for use in adults with active psoriatic arthritis (PsA) or active ankylosing spondylitis (AS).
Simponi Aria is an intravenous formulation of golimumab that is already approved for moderate to severe rheumatoid arthritis. The subcutaneous injection formulation of golimumab, Simponi, is already approved for RA, PsA, AS, and ulcerative colitis. Golimumab is a fully human anti–tumor necrosis factor-alpha therapy, and the intravenous formulation is designed for use as a 30-minute infusion.
“In the study for the treatment of active PsA, patients experienced improvement in joint symptoms and inhibition of structural damage. In the study for treatment of active AS, results showed improvement in measures of disease activity,” according to an Oct. 20 announcement from the manufacturer of golimumab, Janssen Biotech.
Read the revised prescribing information for Simponi Aria here.
Carvedilol fails to reduce variceal bleeds in acute-on-chronic liver failure
WASHINGTON – Treatment with carvedilol reduced the incidence of sepsis and acute kidney injury and improved survival at 28 days but did not significantly reduce the progression of esophageal varices in patients with acute-on-chronic liver failure.
A total of 136 patients with acute-on-chronic liver failure with small or no esophageal varices and a hepatic venous pressure gradient (HVPG) of 12 mm Hg or greater were enrolled in a single-center, prospective, open-label, randomized controlled trial: 66 were randomized to carvedilol and 70 to placebo, according to Sumeet Kainth, MD, of the Institute of Liver and Biliary Sciences in New Delhi.
More than 90% of patients were men with a mean age of 44 years, and composition of the treatment and placebo groups was similar. About 70% in each group had alcoholic hepatitis, the cause of acute-on-chronic liver failure in most. Mean Model for End-Stage Liver Disease (MELD) scores were about 25. Hemodynamic parameters also were comparable, with a mean HVPG of about 19 mm Hg, Dr. Kainth said at the annual meeting of the American Association for the Study of Liver Diseases.
Patients in the treatment group received a median maximum tolerated dose of carvedilol of 12.5 mg, with a range of 3.13 mg to 25 mg.
Morbidity and mortality were high, as is expected with acute-on-chronic liver failure, he noted. A total of 36 patients died before the end of the 90-day study period. Another 23 experienced adverse events and 2 progressed to liver transplant.
HVPG at 90 days decreased significantly in both groups. In the carvedilol group, 90-day HVPG was 16 mm Hg, compared with 19.7 mm Hg at baseline (P < .01). For placebo patients, 90-day HVPG spontaneously improved to 14.8 mm Hg, compared with a baseline of 17.2 mm Hg (P < .01).
Carvedilol did not significantly slow the development or growth of varices, however, Dr. Kainth said. At 90 days, varices had progressed in 9 of 40 (22.5%) patients on carvedilol and 8 of 31 (25.8%) patients on placebo.
Significantly fewer patients in the carvedilol group developed acute kidney injury at 28 days (14% vs. 38% on placebo) and sepsis (5% vs. 20%). Mortality also was reduced significantly at 28 days (11% vs. 24%), he reported.
Treatment with carvedilol did not achieve significant reductions in variceal bleeding, “possibly due to the low number of bleeds seen in the study [because of] the exclusion of patients with large varices,” Dr. Kainth said.
The study was sponsored by the Institute of Liver and Biliary Sciences. Dr. Kainth reported no relevant conflicts of interest.
On Twitter @denisefulton
AT THE LIVER MEETING 2017
Key clinical point: Carvedilol did not significantly slow the progression of esophageal varices in acute-on-chronic liver failure, but it reduced sepsis, acute kidney injury, and mortality at 28 days.
Major finding: At 90 days, varices had progressed in 9 of 40 (22.5%) patients on carvedilol vs. 8 of 31 (25.8%) patients on placebo.
Data source: A single-center, prospective, open-label, randomized controlled trial of 136 patients with acute-on-chronic liver failure.
Disclosures: The study was sponsored by the Institute of Liver and Biliary Sciences. Dr. Kainth reported no relevant conflicts of interest.
In high-risk patients, methylation strikes genes before psychosis hits
BERLIN – Researchers are homing in on several sets of genes that, when altered by as-yet-unknown factors, may signal conversion to full-blown psychosis in people at ultrahigh risk for the disorder.
If confirmed, these candidate markers might have potential as blood-based biomarkers to predict conversion risk and assist in clinical staging, Marie-Odile Krebs, MD, PhD, said at the meeting of the World Psychiatric Association.
The genes modulate three biologic pathways that also have been implicated in schizophrenia: glutathione metabolism, axonal targeting, and inflammation, said Dr. Krebs of Saint-Anne Hospital, Paris. “Knowing this may even help us to target some drugs that work in those pathways,” she said.
Several blood-based analyte screens have been investigated with mixed results, Dr. Krebs noted.
In 2015, researchers at the University of North Carolina at Chapel Hill and Harvard Medical School, Boston, created a 15-analyte plasma panel that performed well in the North American Prodrome Longitudinal Study (NAPLS) cohort. The project is a multisite endeavor that aims to better understand predictors and mechanisms for the development of psychosis. The panel separated 35 unaffected controls from 32 patients with high-risk symptoms who converted to psychosis and from 40 who did not, with an area under the curve (AUC) of 0.91 (Schizophr Bull. 2015 Mar;41[2]:419-28).
Selected from an initial group of 185 analytes, the candidate markers were inflammatory cytokines, proteins modulating blood-brain barrier inflammation, and hormones related to the hypothalamic-pituitary axes. Several also were involved in reacting to oxidative stress.
Earlier this year, members of that same group identified a set of nine microRNAs related to cortical thinning in patients who converted to psychosis. These microRNAs also have been implicated in brain development, synaptic plasticity, immune function, and schizophrenia (Neuropsychopharmacology. 2017 Feb 10. doi: 10.1038/npp.2017.34).
Although these studies are helpful signposts, Dr. Krebs said they do not reflect the dynamic interaction of disease risk, which includes not only the intrinsic factors of genetics, enzymes, and proteins, but the extrinsic risks imposed by other factors: stress, trauma, cannabis use, and other completely individual experiences. “This is a dynamic process, and we need a dynamic assessment,” she said.
To that end, Dr. Krebs and her colleagues decided to look at methylomic changes in a small group of 39 patients at ultrahigh risk for psychosis conversion. All of these patients (mean age, 22 years) were seen at Saint-Anne Hospital from 2009 to 2013. Using whole blood, Dr. Krebs performed a genomewide DNA methylation study to determine what genes – if any – were differently methylated between the converters and nonconverters. The mean follow-up was 1 year (Mol Psychiatry. 2017 Apr;22[4]:512-8).
Although no significant difference was found in global methylation associated with conversion, Dr. Krebs did find longitudinal changes associated with conversion in three regions.
A cluster of five genes in the glutathione S-transferase family was differently methylated between the converters and nonconverters. Two regions were related to the promoter of GSTM5, which encodes a cytosolic and membrane-bound glutathione S-transferase – an important antioxidant enzyme whose downregulation has been implicated in schizophrenia. These two regions appeared to be stable over time, suggesting that methylation occurred before conversion, Dr. Krebs said.
Oxidative stress has been implicated in schizophrenia, and GSTM5 is expressed in the brain, Dr. Krebs noted. Some researchers suggest the gene is involved in dopamine metabolism. It’s also underexpressed in the prefrontal cortex of schizophrenia patients.
Three other regions in the GST family changed with conversion: two on the glutathione S-transferase theta 1 gene and one on the glutathione S-transferase P gene. Since all of these have to do with production of the innate antioxidant glutathione, “these findings suggest a potential use for antioxidant drugs,” Dr. Krebs said.
She found two other differently methylated regions as well.
One was a cluster of eight genes that are all involved in axon guidance – the process by which axons branch out to their correct targets. The second cluster comprised seven genes, all of which are involved in regulating interleukin-17 signaling. This cytokine has been implicated in autoimmune disorders.
Finally, Dr. Krebs performed a transcriptome analysis looking at the brain-expressed messenger RNA in the samples. “The methylome seemed less dynamic than the transcriptome,” she said. “Some methylomic changes may have occurred several months before the conversion, whereas transcriptomic analysis may reflect more rapid changes.”
There was only a 22% concordance between the two analyses. However, the GSTM5 gene and the neuropilin 1 gene – one of those involved in axon guidance – were both methylated and downregulated in the converters. The transcriptome analysis also found significantly decreased expression (although not methylation) of another gene, carnitine palmitoyltransferase 1A, which encodes a key enzyme that transports long-chain fatty acids into the mitochondria for oxidation.
Adapting these observed differences in gene expression into a useful clinical tool will be challenging, Dr. Krebs said. In addition to large-group validation, any risk prediction model would have to take into account the many other factors that influence psychosis conversion: cerebral and sexual maturation during adolescence, cannabis use, and stress and other completely individual life experiences.
Nevertheless, she concluded, “longitudinal ‘multi-omics’ may be a step toward a future of personalized molecular psychiatry.”
Dr. Krebs had no relevant financial disclosures.
On Twitter @alz_gal
EXPERT ANALYSIS FROM WPA 2017
Genetic analysis indicates ovarian cancer may originate in fallopian tubes
Many of the most severe ovarian cancer cases may originate in the fallopian tube (FT), based on data from an analysis of nine patients published online in Nature Communications.
“Our data suggest that FT neoplasia is the origin of ovarian serous carcinogenesis, and can directly lead to cancer of the ovaries and of other sites,” wrote Sana Intidhar Labidi-Galy, MD, of Dana-Farber Cancer Institute, Boston, and her colleagues (Nature Commun. 2017 Oct 23. doi: 10.1038/s41467-017-00962-1).
Preliminary evidence suggests that fallopian tube cancers may develop into high-grade serous ovarian carcinoma (HGSOC), but evolutionary evidence is limited, the researchers said.
They conducted genetic sequencing on 37 tumor samples from five adult patients with HGSOC, and they also studied serous tubal intraepithelial carcinomas from four patients.
“As expected, we identified sequence changes in the TP53 tumor suppressor gene, a well-known driver gene in HGSOC, in all cases,” the researchers wrote.
“The TP53 alterations were identical in all samples analyzed for each patient including in the p53 signatures, the [serous tubal intraepithelial carcinoma] lesions, and other carcinomas,” Dr. Labidi-Galy and her associates said. Beyond TP53, the researchers also noted changes in areas of several other known ovarian cancer genes, including BRCA1 and BRCA2.
The study findings were limited by the small size of the tumor samples and small number of cells, the researchers noted.
The results, however, suggest an avenue for further research to help guide early detection and treatment of ovarian cancer, such as the potential removal of fallopian tubes rather than the ovaries in some cases, they concluded.
The research was supported by multiple foundations and organizations, including the National Institutes of Health. One of the investigators is a founder of Personal Genome Diagnostics and a member of its scientific advisory board and board of directors. The other researchers had no financial conflicts to disclose.
FROM NATURE COMMUNICATIONS
New biomarkers improve DILI predictability
WASHINGTON – Researchers have identified six new biomarkers of drug-induced liver injury (DILI) that, when combined with traditional measurements, seemed to better predict the disease course, compared with traditional biomarkers alone, according to a presentation at the annual meeting of the American Association for the Study of Liver Diseases.
In addition, some of these biomarkers may provide a “liquid biopsy” to assess degree of inflammation and mode of hepatocyte death, said Rachel Church, PhD, of the University of North Carolina, Chapel Hill.
“The motivation behind this research is that the standard biomarkers for DILI have several shortcomings,” Dr. Church said. “They’re not entirely liver specific, they’re not mechanistically informative, and they’re not sufficiently predictive of outcome.”
The researchers found that elevated levels of these six candidate biomarkers were predictive for adverse outcome in DILI: total keratin18 (K18); caspase-cleaved K18 (ccK18); alpha-fetoprotein (AFP); osteopontin (OPN); fatty acid–binding protein 1 (FABP1); and macrophage colony-stimulating factor receptor (MCSFR) determined by immunoassay. “We believe that using some of these candidate biomarkers in combination with the standard tests may be the best way to identify individuals at risk for an adverse outcome,” Dr. Church said.
While their analysis found that the traditional international normalized ratio had the overall best predictive value, with an area under the curve (AUC) of 0.922, the candidate biomarker OPN was second best with an AUC of 0.871, “and actually performed better than total bilirubin,” Dr. Church said.
The study evaluated mechanistic candidate biomarkers by obtaining biopsies in a cohort of 27 patients within 2 weeks of diagnosis, focusing on three physiological reactions: inflammation, necrosis, and apoptosis.
With regard to inflammation, Dr. Church said, “What we found was that MCSFR actually was significantly elevated in patients who had a high score for inflammation; however, there was no significant difference in OPN, although there was a slight elevation.”
They evaluated necrosis using a semiquantitative confluent coagulative necrosis score, and found no difference in the typical biomarkers of cell necrosis, such as alanine aminotransferase, aspartate aminotransferase, and K18. “So we also looked at the regenerative biomarkers, OPN and AFP, and indeed, we observed that both were significantly elevated with high confluent coagulative necrosis scores,” she said.
To evaluate apoptosis, the researchers used the semiquantitative apoptosis score. “We found there was a small but significant elevation in ccK18 in individuals with a high apoptosis score,” she said. They then evaluated the ratio of ccK18 to K18. “The closer the score is to 1, the more apoptosis you have; and the closer the score is to 0, the more necrosis you have,” Dr. Church said.
They also developed a predictive model that combined the traditional biomarkers INR, total bilirubin, and aspartate aminotransferase with the candidate biomarkers OPN and K18, which had an AUC of 0.97. “Some analysis of candidate biomarkers in combination with tests such as MELD score [Model for End-Stage Liver Disease] and ‘Hy’s Law’ saw that incorporating candidate biomarkers was useful,” Dr. Church said.
Dr. Church reported having no financial disclosures.
AT THE LIVER MEETING 2017
Key clinical point: Combining six candidate biomarkers with traditional biomarkers may improve prediction of adverse outcomes in drug-induced liver injury.
Major finding: Candidate biomarker osteopontin had an area under the curve of 0.871, second only to the traditional biomarker international normalized ratio and exceeding that of total bilirubin.
Data source: Analysis of serum samples collected by the DILI Network from 145 patients with a greater than 50% likelihood of having DILI.
Disclosures: Dr. Church reported having no financial disclosures.
Firearms’ injury toll of $3 billion just ‘a drop in the bucket’
SAN DIEGO – The true impact of firearms injuries may be greatly underestimated, according to a study presented at the American College of Surgeons Clinical Congress.
An analysis released earlier this month estimated that firearms injuries cost nearly $3 billion a year in emergency department and inpatient treatment costs. The real cost is likely to be 10-20 times higher, said the lead author of the study, Faiz Gani, MD, a research fellow with the Johns Hopkins Surgery Center for Outcomes Research, Baltimore.
“This is just a drop in the bucket,” Dr. Gani said in an interview at the annual clinical congress of the American College of Surgeons.
Dr. Gani and his colleagues launched their study (Health Affairs 2017;36[10]:1729-38) to better understand the cost of firearms injuries, including nonfatal and accidental injuries.
Most estimates of the cost of firearm injuries are outdated or focused on states or single trauma centers, he said. “Contemporary [costs] for emergency rooms are unknown,” he said. “Also, the numbers come down and shoot up. It’s important to continually study this.”
The statistics are especially important to surgeons, who handle these injuries. “A lot of times the surgeon is the primary health care provider if the patient is injured severely. It’s important that we as surgeons know what’s going on.”
The researchers retrospectively analyzed data from the Nationwide Emergency Department Sample of the Healthcare Cost and Utilization Project for the years 2006-2014. They identified 150,930 patients who presented alive to emergency departments with firearms injuries over that period, and they estimated the total weighted number at 704,916.
They found that the incidence of firearms injury admissions actually fell during 2006-2013 (from 27.9 visits per 100,000 people to 21.5, P < .001) but bumped up by 23.7% to 26.6 during 2013-2014 (P < .001).
Not surprisingly, more men were injured than women: 45.8 firearms-injured men per 100,000 patients presenting at emergency departments, compared with 5.5 firearms-injured women. Assaults (49.5%) and accidents (35.3%) accounted for most cases, followed by attempted suicides (5.3%) and legal intervention (2.4%).
Those who were assaulted had a higher likelihood of being poor, while those who tried to kill themselves were more likely to have the highest incomes among firearms-injured patients.
The average costs of emergency and inpatient care for patients injured by firearms were $5,254 and $95,887, respectively, collectively amounting to about $2.8 billion each year.
Dr. Gani noted that the estimates of the cost and impact of firearms injuries don’t account for people who died of firearms injuries before reaching the emergency department, including patients who died by suicide at home.
The cost estimates also don’t take follow-up care, rehabilitation, and lifelong disability into account. The surgical portion of the cost is likely to be much higher because the study doesn’t take future surgical procedures into account, he said.
Based on estimates by the Centers for Disease Control and Prevention of the impact of the injuries, Dr. Gani argued that the true annual cost could be 10 or 20 times the nearly $3 billion estimated by the study.
Discussant Elliott R. Haut, MD, FACS, a trauma surgeon at Johns Hopkins Medicine in Baltimore, agreed that the cost and impact estimated in the study represent a small part of a larger toll. Some families and individuals can pay those costs more than once. He recalled hearing from family members of firearm victims who recognize him because they’ve been at the hospital for other shooting incidents. “We’ve all heard someone say, ‘You were here the last time when my brother/cousin/uncle was shot,’ ” he said.
Future research should focus on better understanding the long-term cost of firearm injuries and the influence of socioeconomics and demographics, Dr. Gani said.
Dr. Gani and Dr. Haut reported no relevant disclosures.
AT THE ACS CLINICAL CONGRESS
VIDEO: Researchers beginning to explore microbiome’s effect on surgical outcomes
SAN DIEGO – Surgery seems to stimulate abrupt changes in both the skin and gut microbiome, which in some patients may increase the risk of surgical-site infections and anastomotic leaks. With that knowledge, researchers are exploring the very first steps toward a presurgical microbiome optimization protocol, Heidi Nelson, MD, FACS, said at the annual clinical congress of the American College of Surgeons.
It’s very early in the journey, said Dr. Nelson, the Fred C. Andersen Professor of Surgery at Mayo Clinic, Rochester, Minn. The path is not straightforward because the human microbiome appears to be nearly as individually unique as the human fingerprint, so presurgical protocols might have to be individually tailored to each patient.
Dr. Nelson comoderated a session exploring this topic with John Alverdy, MD, FACS, of the University of Chicago. The panel discussed human and animal studies suggesting that the stress of surgery, when combined with subclinical ischemia and any baseline physiologic stress (chronic illness or radiation, for example), can cause some commensals to begin producing collagenase – a change that endangers even surgically sound anastomoses. The skin microbiome is altered as well, with areas around abdominal incisions beginning to express gut flora, a change that increases the risk of a surgical-site infection.
Through diet or other presurgical interventions, Dr. Nelson said in a video interview, it might be possible to optimize the microbiome and reduce the chances of some of these occurrences.
She had no financial disclosures.
On Twitter @Alz_Gal
AT THE ACS CLINICAL CONGRESS