Checkpoint inhibitors linked to rare but serious immune-related side effects


Checkpoint inhibitors can cause rare but serious hematological immune-related adverse events (hem-irAEs), which require early detection and intervention, according to a recent French study.

Immune thrombocytopenia, hemolytic anemia, and neutropenia were the most common hem-irAEs in the population, reported lead author Nicolas Delanoy, MD, of Gustave Roussy, Université Paris-Saclay, Villejuif, France, and his colleagues.

“About 71% of patients treated have any-grade irAEs and 10% have grade 3-4 irAEs after anti-PD-1 immunotherapy,” the investigators wrote in The Lancet Haematology. “In most cases, they involve the skin, gastrointestinal tract, thyroid or endocrine glands, liver, lungs, or joints. However, all organs can potentially be affected, including the hemopoietic system.”

Despite this possibility, few reports detail the frequency or character of hematological toxicities from immunotherapy.

The present study involved 948 patients who entered into three French registries between 2014 and 2018. The first registry, consisting of 745 patients, was observed prospectively during checkpoint inhibitor therapy. The other two registries provided retrospective data on confirmed irAEs or hem-irAEs.

Among 745 patients followed during checkpoint inhibitor therapy, four developed hem-irAEs, providing an incidence rate of 0.5%. The other two databases added 31 patients with confirmed hem-irAEs, allowing for characterization of 35 total cases.

The group of 35 patients had a median age of 65 years, with more men (n = 21) than women (n = 14). Melanoma was the most common type of malignancy (43%), followed by non–small-cell lung cancer (34%), lymphoma (11%), and others. The majority of patients received nivolumab (57%), slightly fewer received pembrolizumab (40%), and a small minority received atezolizumab (3%).

Immune thrombocytopenia, hemolytic anemia, and neutropenia were the most common hem-irAEs, each occurring in nine patients (26%). Five patients (14%) had aplastic anemia or pancytopenia, two patients had bicytopenia (6%; neutropenia and anemia or thrombocytopenia and anemia), and one patient had pure red cell aplasia (3%).

Hem-irAEs resolved in 60% of patients, but two patients (6%) died due to febrile neutropenia. Overall, 71% of hem-irAEs were grade 4.

These findings suggest that hem-irAEs are rare, but they are often serious, and potentially life-threatening, the researchers noted.

Seven of the 35 patients (20%) were rechallenged with checkpoint inhibitor therapy; of these, 3 (43%) had recurrence of hem-irAEs. This finding should prompt caution and close monitoring if rechallenge is elected.

“This observational study encourages further, in-depth investigations of hematological immune toxicities, to search for biomarkers that can be helpful for earlier detection,” the investigators concluded.

This study was funded by Gustave Roussy and the Gustave Roussy Immunotherapy Program. Dr. Delanoy reported nonfinancial support from Sanofi, and other authors reported financial relationships with pharmaceutical companies.

SOURCE: Delanoy N et al. Lancet Haematol. 2018 Dec 4. doi: 10.1016/S2352-3026(18)30175-3.


FROM THE LANCET HAEMATOLOGY

Vitals

 

Key clinical point: Checkpoint inhibitors can cause rare, but potentially serious, hematological immune-related adverse events, which require early detection and intervention.

Major finding: Checkpoint inhibitor therapy led to hematological toxicity in 0.5% of patients.

Study details: A study of 948 patients in French registries who were observed prospectively or retrospectively, including a case series of 35 patients treated with checkpoint inhibitor therapy who developed hematologic, immune-related adverse events.

Disclosures: This study was funded by Gustave Roussy and the Gustave Roussy Immunotherapy Program. Dr. Delanoy reported nonfinancial support from Sanofi, and other authors reported financial relationships with pharmaceutical companies.

Source: Delanoy N et al. Lancet Haematol. 2018 Dec 4. doi: 10.1016/S2352-3026(18)30175-3.


CRS/HIPEC safety concerns may be outdated

Risks and benefits of CRS/HIPEC remain unclear

 

Cytoreductive surgery (CRS) with hyperthermic intraperitoneal chemotherapy (HIPEC) appears safe, and concerns about high complication rates may be outdated, according to a retrospective study involving more than 34,000 cases.

Compared with four other surgical oncology procedures considered high risk, CRS/HIPEC had the lowest 30-day mortality rate, reported lead author Jason M. Foster, MD, of the University of Nebraska Medical Center in Omaha, and his colleagues.

“The perception of high morbidity, high mortality, and poor surgical outcomes remains a barrier to CRS/HIPEC patient referral as well as clinical trial development in the United States, despite the published noncomparative data establishing contemporary safety,” the investigators wrote in JAMA Network Open.

The study involved 34,114 patients from the American College of Surgeons National Surgical Quality Improvement Project (NSQIP) database who underwent CRS/HIPEC (n = 1,822), trisegmental hepatectomy (n = 2,449), right lobe hepatectomy (n = 5,109), pancreaticoduodenectomy (Whipple; n = 16,793), or esophagectomy (n = 7,941) during 2005-2015. The investigators compared rates of overall 30-day postoperative mortality, superficial incisional infection, deep incisional infection, organ space infection, return to the operating room, and length of hospital stay.



Analysis revealed that CRS/HIPEC had a 30-day mortality rate of 1.1%, which was lower than rates of 2.5%-3.9% for the comparative procedures. Similarly, the organ space infection rate was lowest for CRS/HIPEC (7.2%). Superficial and deep incisional infection rates were 5.4% and 1.7%, respectively, for CRS/HIPEC, lower than all procedures except right lobe hepatectomy, with rates of 4.6% and 1.5%. Return to the operating room was necessary for 6.8% of CRS/HIPEC patients, a rate similar to the other procedures except esophagectomy, in which return to the operating room was necessary 14.4% of the time. Finally, CRS/HIPEC had a median length of stay of 8 days, which was slightly longer than right lobe or trisegmental hepatectomy (7 days), but shorter than the Whipple procedure or esophagectomy (10 days).

“This study found that CRS/HIPEC had the lowest mortality risk, almost 50%-75% lower than other advanced oncology surgical procedures,” the investigators noted. “These findings provide objective data to dispel the misperception of morbidity and mortality concerns surrounding CRS/HIPEC, and surgical risk should no longer remain a deterrent to patient referral or development of clinical trials for CRS/HIPEC.”

The study was funded by the Platon Foundation and the Hill Foundation. The authors reported no conflicts of interest.

SOURCE: Foster JM et al. JAMA Netw Open. 2019 Jan 11. doi: 10.1001/jamanetworkopen.2018.6847.


 

The recent study by Foster et al. provides insight into the national safety of cytoreductive surgery combined with hyperthermic intraperitoneal chemotherapy (CRS/HIPEC); however, more detailed safety and efficacy data are needed to influence current practices, according to Margaret Smith, MD, and Hari Nathan, MD, PhD.

A closer look at the Foster et al. study reveals three key limitations: First, “cytoreductive surgery encompasses a wide range of procedures, from resection of one peritoneal nodule to multivisceral resection with peritoneal stripping, and, thus, reflects a wide range of possible morbidity,” the authors wrote in an editorial for JAMA Network Open. Therefore, the findings may not represent certain patient populations.

Second, “comparison with other procedures for different indications constructs a straw man.” In contrast with some candidates for CRS/HIPEC, “a patient with pancreatic cancer has no other curative option besides a Whipple procedure.” This imperfect comparison should be considered as such.

Third, the safety of CRS/HIPEC may not be the procedure’s primary limitation. “A more salient concern may be its oncologic effectiveness,” the authors wrote.

Although a randomized clinical trial from 2003 involving patients with colorectal peritoneal carcinomatosis showed a near doubling of overall survival with CRS/HIPEC, compared with systemic chemotherapy alone (22 vs. 12.5 months), a comprehensive understanding of safety and efficacy is lacking, particularly regarding the inclusion of HIPEC. For example, the recent phase 3 Prodige 7 trial showed that addition of HIPEC to CRS added morbidity without survival advantage in patients with colorectal peritoneal carcinomatosis; in contrast, a separate phase 3 trial in epithelial ovarian cancer showed that adding HIPEC to CRS did extend survival.

“…Others have cautioned against changing practice based on these results given concerns over small sample size, imbalances in effects seen across centers, and overall survival with CRS/HIPEC that was similar to other studies’ reported survival following interval debulking alone. Legitimate concerns regarding the efficacy of CRS/HIPEC exist, and appropriate patient selection for this aggressive treatment remains a challenge. Foster et al. demonstrates acceptable morbidity and mortality rates for CRS/HIPEC in this highly selected patient cohort. However, until the benefit for individual patients is more thoroughly understood, clinician referral and treatment practices will remain difficult to transform,” the authors wrote.

Dr. Smith and Dr. Nathan are affiliated with Michigan Medicine at the University of Michigan in Ann Arbor. These comments are adapted from the accompanying editorial (JAMA Netw Open 2019 Jan 11. doi:10.1001/jamanetworkopen.2018.6839).


 


FROM JAMA NETWORK OPEN

Vitals

 

Key clinical point: Cytoreductive surgery (CRS) with hyperthermic intraperitoneal chemotherapy (HIPEC) appears safe, and concerns about high complication rates may be outdated.

Major finding: CRS/HIPEC had a 30-day mortality rate of 1.1%, which was lower than rates of 2.5%-3.9% for comparative high-risk surgical oncology procedures.

Study details: A retrospective study of 34,114 patients from the American College of Surgeons National Surgical Quality Improvement Project (NSQIP) database who underwent CRS/HIPEC (n = 1,822), trisegmental hepatectomy (n = 2,449), right lobe hepatectomy (n = 5,109), pancreaticoduodenectomy (Whipple; n = 16,793), or esophagectomy (n = 7,941) during 2005-2015.

Disclosures: The study was funded by the Platon Foundation and the Hill Foundation. The authors reported no conflicts of interest.

Source: Foster JM et al. JAMA Netw Open. 2019 Jan 11. doi: 10.1001/jamanetworkopen.2018.6847.


Dietary aluminum may trigger IBS

Aluminum hypothesis hard to test in humans

Aluminum ingested in small amounts causes visceral hypersensitivity in rats, suggesting that dietary levels of aluminum may trigger irritable bowel syndrome (IBS) in humans, according to a study published in Cellular and Molecular Gastroenterology and Hepatology.

Rats given oral aluminum exhibited dose-dependent visceral pain along with activation of proteinase-activated receptor-2 (PAR2) and mast cell degranulation, a combination of events that mirrors the clinical signs and molecular mechanisms of IBS in humans, reported lead author Nicolas Esquerre, PhD, of the Lille Inflammation Research International Center at Université Lille in France, and his colleagues. The study contributes to ongoing research surrounding causes and mechanisms of IBS, which may vary among patients because of disease subsets. These findings suggest that some patients with IBS may benefit from dietary aluminum restriction or chelation therapy.

“[T]he question of the initial trigger [of IBS] still remains unresolved,” the investigators wrote. “A more precise link between food and IBS has been demonstrated for gluten and other wheat proteins, lactose, and nickel, highlighting particular subsets of IBS patients now diagnosed as nonceliac gluten/wheat sensitivity, lactose intolerance, and nickel-allergic contact mucositis,” they added. “Here, we evaluated the effect of aluminum, a common contaminant of food and water, on abdominal pain.”

Aluminum may enter the diet as a food additive, or it may contaminate foods grown in aluminum-rich soil. Other sources of oral exposure include packaging and kitchenware. A previous study showed that most Americans ingest 0.01-1.4 mg/kg of aluminum daily, and 5% ingest 1.58 mg/kg daily (i.e., 95 mg per day for a 60-kg person).

Based on these statistics, rats in the present study received oral aluminum citrate (AlCi) corresponding to three doses of aluminum: 0.5 mg/kg, 1.5 mg/kg, or 3.0 mg/kg. Treatment continued for 30 days, with colorectal distension (CRD) measured on days 2, 4, 8, 15, and 30.

Results showed a dose-dependent relationship between aluminum ingestion and visceral hypersensitivity. Within 2 days, rats receiving 3.0 mg/kg of aluminum exhibited a significantly lower pain threshold, and within 8 days, rats receiving 0.5 mg/kg and 1.5 mg/kg also showed increased visceral hypersensitivity.

After 1 month of treatment, rats receiving 1.5 mg/kg per day demonstrated a 30% increase in pain compared with control animals. In the same group, visceral hypersensitivity began to wane 7 days after cessation of treatment; 4 more weeks were needed to return to baseline. When treatment was restarted, visceral hypersensitivity occurred within 2 days, compared with 8 days upon initial administration. These findings are particularly relevant to some people, as the 1.5-mg/kg dose corresponds with the daily amount of aluminum ingested by 5% of Americans. Similar patterns of response and sensitization were observed in rats ingesting 0.5 mg/kg and 3.0 mg/kg. Female rats were more sensitive to aluminum than were male rats, a sex pattern that mimics human IBS.

Further testing showed that rats treated with zinc citrate (ZnCi) did not exhibit changes in pain threshold, thereby excluding citrate as an aggravating factor. Rat models of noninflammatory and inflammatory colonic hypersensitivity (butyrate enema or intrarectal instillation of 25%-50% ethanol in combination with 2,4,6-trinitrobenzenesulfonic acid, respectively) showed visceral hypersensitivity similar to that of rats in the 1.5-mg/kg AlCi group.

Testing of colonic tissue from AlCi-treated rats did not reveal inflammatory changes on a variety of measures, including histology, myeloperoxidase activity, mRNA expression of several inflammatory cytokines, and infiltration of eosinophils or macrophages. Noninflammatory effects of aluminum, however, were found. For instance, treated rats had lower serotonin levels in enteroendocrine cells.

“Enteroendocrine cells are specialized epithelial cells that respond to luminal stimuli by releasing various biologically active compounds,” the investigators wrote. “They regulate several physiological and homeostatic functions of the gastrointestinal tract, such as postprandial secretion, motility, immune responses, and sensory functions. A reduced number of enteroendocrine cells has been observed in the duodenum, ileum, and colon of some patients with IBS.”

In addition to changes in enteroendocrine cells, AlCi-treated rats had greater colonic mast cell degranulation and higher histamine levels, along with upregulation of histidine decarboxylase transcripts, suggesting that aluminum activated mast cells.

To determine the role of mast cell activation in visceral hypersensitivity, rats were given AlCi with cromoglycate, an inhibitor of mast cell degranulation. This treatment reduced both mast cell degranulation and visceral hypersensitivity, compared with AlCi-treated rats not receiving cromoglycate, suggesting that mast cell degranulation is a primary driver of visceral hypersensitivity. This observation was confirmed in a mast cell–deficient mouse strain (Kit W-sh/W-sh): treating these mice with AlCi did not induce visceral hypersensitivity, further supporting the role of mast cell degranulation.

Along with mast cell degranulation, AlCi treatment led to PAR2 activation. The investigators explored the significance of this finding with PAR2 knockout mice. When treated with AlCi, PAR2 knockout mice showed no increase in visceral hypersensitivity, suggesting that hypersensitivity depends on PAR2 activation. Further testing revealed that mast cell–deficient mice (Kit W-sh/W-sh) also lacked PAR2 upregulation, suggesting a sequence in which aluminum triggers mast cell degranulation, mast cell degranulation drives PAR2 upregulation, and PAR2 upregulation causes visceral hypersensitivity. The latter two events in this chain – mast cell degranulation and PAR2 upregulation – mirror molecular mechanisms of IBS in humans.

“We speculate that aluminum activates mast cells to release mediators that can increase excitability of nociceptive afferences contributing to the visceral pain phenotype,” the investigators wrote. “Taken together, our results linked aluminum to several mechanisms implicated in IBS pathophysiology, highlighting a possible role for aluminum as a triggering factor in IBS development.”

The investigators suggested that these findings could influence preventive or therapeutic strategies: “Aluminum might be the first identified dietary risk factor for IBS, implying that measures to limit aluminum dietary consumption or to chelate aluminum may represent novel pathways of prevention and treatment of IBS in some susceptible patients,” they wrote.

The study was funded by the European Fund for Regional Economic Development; the Hauts de France Region, Ministère de l’Enseignement Supérieur et de la Recherche (CPER IRENI); and Digestscience (European Research Foundation on Intestinal Diseases and Nutrition).

SOURCE: Esquerre N et al. Cell Mol Gastroenterol Hepatol. 2019 Sep 20. doi: 10.1016/j.jcmgh.2018.09.012.

Aluminum hypothesis hard to test in humans

Irritable bowel syndrome is a chronic functional gastrointestinal disorder, characterized by relapsing/remitting diarrhea, constipation, and visceral pain. IBS afflicts 10%-25% of the population in developed countries.

Despite histologically normal intestinal biopsy specimens, biological signatures of IBS include alterations in intestinal gene expression, increased gut permeability, and changes in gut microbiota composition. Thus, although the cause or causes of IBS are not defined, these and other data highlight the enormous breadth of factors that might play a role in this disorder. Similar alterations also are associated with inflammatory bowel disease (IBD), although the magnitude of changes is typically greater in IBD. Nevertheless, these data suggest that IBS and IBD may share triggers and pathogenetic mechanisms.

That both IBS and IBD have shown marked increases in incidence, roughly paralleling the modernization of society that accelerated in the mid-20th century, raises the possibility that environmental factors associated with human activity may be drivers of both diseases. Recent findings suggest that aluminum may be one such trigger. While humans have always been exposed to aluminum, the most abundant metal in the Earth's crust, industrialization has increased the magnitude of exposure owing to the use of aluminum salts as stabilizers in processed foods and the concentration of groundwater aluminum in agricultural products. Mimicking estimated average human ingestion of aluminum by administering it orally to rats increases their perception of visceral pain. These results suggest a possible role for increased aluminum exposure in driving the post–mid-20th-century rise in the incidence of IBS.

Unfortunately, only broad societal estimates of aluminum exposure are available, and aluminum levels are difficult to measure in individuals, making it difficult to investigate epidemiologically the role of aluminum in promoting GI disease in humans. Hence, I submit that levels of aluminum ingestion by humans should be more closely monitored and the potential of aluminum to promote GI disease carefully scrutinized.

Andrew Ted Gewirtz, PhD, distinguished university center professor, Georgia State University’s Institute for Biomedical Sciences’ Center for Inflammation, Immunity and Infection, Atlanta.




FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY

Vitals

 

Key clinical point: Aluminum ingestion triggers visceral hypersensitivity in rats, suggesting that dietary levels of aluminum may contribute to development of irritable bowel syndrome in humans.

Major finding: In rodents, 1 month of oral aluminum administration led to a 30% increase in pain during colorectal distension, compared with control animals.

Study details: A rodent study including noninflammatory and inflammatory IBS rat models, mast cell–deficient mice, and PAR2 knockout mice.

Disclosures: The study was funded by the European Fund for Regional Economic Development; the Hauts de France Region, Ministère de l’Enseignement Supérieur et de la Recherche (CPER IRENI); and Digestscience (European Research Foundation on Intestinal Diseases and Nutrition).

Source: Esquerre N et al. Cell Mol Gastroenterol Hepatol. 2019 Sep 20. doi: 10.1016/j.jcmgh.2018.09.012.


HPV-16/-18 dramatically increases risk of high-grade CIN in young women

Article Type
Changed
Fri, 01/04/2019 - 14:28

 

Young women with HPV-16/-18 are significantly more likely to develop high-grade cervical intraepithelial neoplasia (CIN), compared with young women who do not have HPV-16/-18, and therefore require close monitoring, according to a 9-year study of more than 500 women.

The specific HPV strain had less effect on risk in women aged 30 years or older than in younger women, reported lead author Maria Fröberg, MD, PhD, of Karolinska University Hospital and Institute in Stockholm and her colleagues.

“With today’s introduction of HPV primary screening into several organized screening programs and with many triage algorithms available, further research is needed to ensure safe follow-up management and prevent the unnecessary treatment of transient positive HPV findings associated with regressive high-grade CIN,” the investigators wrote in Cancer.

To better understand risk associated with HPV, the investigators drew from a database of 9,464 Swedish women who were cytologically negative for cervical intraepithelial lesions or malignancy (NILM) at baseline during 2005-2007. These baseline-negative women were followed for 9 years; during this time, 96 women developed histologically confirmed, high-grade CIN (CIN2, CIN3, cervical cancer, or adenocarcinoma in situ [AIS]). For each case, five age-matched women were selected who did not develop high-grade CIN to make a control cohort of 480 women.

Approximately half of the cases had CIN2 (45.8%), and half had CIN3 or worse histopathology (CIN3+, 54.2%). HPV-16/-18 was more often associated with CIN3+, compared with CIN2 (Pearson χ2, 6.12; P less than .02 [2-sided]). Women with high-grade CIN were significantly more likely to have HPV of any strain, compared with controls (odds ratio, 6.78). Women aged younger than 30 years who had HPV-16/-18 at baseline were far more likely to develop high-grade CIN (OR, 9.44) but showed less impact from other strains of HPV (OR, 2.24). In contrast, women aged 30 years or older showed similar increases in high-grade CIN risk when comparing HPV-16/-18 with other strains (OR, 8.16 vs. 9.04).
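The reported chi-square statistic can be sanity-checked against its P value. Assuming 1 degree of freedom (not stated in the article, but standard for a two-group comparison), the survival function of a 1-df chi-square variable reduces to the complementary error function, which is available in Python's standard library:

```python
import math

def chi2_sf_1df(x):
    """P(X > x) for a chi-square variable X with 1 degree of freedom.

    With 1 df, X is the square of a standard normal Z, so
    P(X > x) = P(|Z| > sqrt(x)) = erfc(sqrt(x / 2)).
    """
    return math.erfc(math.sqrt(x / 2))

p = chi2_sf_1df(6.12)
print(f"P = {p:.4f}")  # roughly 0.013, consistent with "P less than .02"
```

The familiar 1-df critical value of 3.84 recovers P of approximately .05 with the same function, which is a quick way to confirm the formula.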

“These latter findings suggest that genotyping for HPV-16/-18 might be useful for risk stratification among younger women,” the investigators suggested, noting that “further prospective study on this topic is warranted.”

The study was funded by the Swedish Cancer Foundation, the Stockholm County Council, the Swedish Research Council, the King Gustaf V Jubilee Fund, and the Karolinska Institute. During the study, one investigator received grants from VALGENT and the 7th Framework Programme of DG Research and Innovation (European Commission).

SOURCE: Fröberg M et al. Cancer. 2018 Dec 10. doi: 10.1002/cncr.31788.

Publications
Topics
Sections

 

Young women with HPV-16/-18 are significantly more likely to develop high-grade cervical intraepithelial neoplasia (CIN), compared with young women who do not have HPV-16/-18, and therefore require close monitoring, according to a 9-year study of more than 500 women.

Young women with HPV-16/-18 are significantly more likely to develop high-grade cervical intraepithelial neoplasia (CIN), compared with young women who do not have HPV-16/-18, and therefore require close monitoring, according to a 9-year study of more than 500 women.

The specific HPV strain had less effect on risk in women aged 30 years or older than in younger women, reported lead author Maria Fröberg, MD, PhD, of Karolinska University Hospital and Institute in Stockholm, and her colleagues.

“With today’s introduction of HPV primary screening into several organized screening programs and with many triage algorithms available, further research is needed to ensure safe follow-up management and prevent the unnecessary treatment of transient positive HPV findings associated with regressive high-grade CIN,” the investigators wrote in Cancer.

To better understand risk associated with HPV, the investigators drew from a database of 9,464 Swedish women who were cytologically negative for cervical intraepithelial lesions or malignancy (NILM) at baseline during 2005-2007. These baseline-negative women were followed for 9 years; during this time, 96 women developed histologically confirmed, high-grade CIN (CIN2, CIN3, cervical cancer, or adenocarcinoma in situ [AIS]). For each case, five age-matched women were selected who did not develop high-grade CIN to make a control cohort of 480 women.

Approximately half of the cases had CIN2 (45.8%), and half had CIN3 or worse histopathology (CIN3+, 54.2%). HPV-16/-18 was more often associated with CIN3+, compared with CIN2 (Pearson chi-square, 6.12; P less than .02 [2-sided]). Women with high-grade CIN were significantly more likely to have HPV of any strain, compared with controls (odds ratio, 6.78). Women aged younger than 30 years who had HPV-16/-18 at baseline were far more likely to develop high-grade CIN (OR, 9.44) but showed less impact from other strains of HPV (OR, 2.24). In contrast, women aged 30 years or older showed similar increases in high-grade CIN risk when comparing HPV-16/-18 with other strains (OR, 8.16 vs. 9.04).
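For readers unfamiliar with case-control statistics, the odds ratios and chi-square value above come from standard 2x2 exposure-by-outcome tables. The sketch below shows the arithmetic; the cell counts are hypothetical placeholders chosen only to illustrate the calculation, not the study's actual data.

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    return (a * d) / (b * c)

def pearson_chi2(a, b, c, d):
    """Pearson chi-square statistic for the same 2x2 table."""
    n = a + b + c + d
    rows = (a + b, c + d)           # case total, control total
    cols = (a + c, b + d)           # exposed total, unexposed total
    cells = [(a, 0, 0), (b, 0, 1), (c, 1, 0), (d, 1, 1)]
    chi2 = 0.0
    for observed, r, col in cells:
        expected = rows[r] * cols[col] / n   # counts expected under independence
        chi2 += (observed - expected) ** 2 / expected
    return chi2

# Hypothetical table: 40 of 96 cases HPV positive vs. 50 of 480 controls.
or_estimate = odds_ratio(40, 56, 50, 430)
chi2 = pearson_chi2(40, 56, 50, 430)
```

With these made-up counts the odds ratio is about 6.1; the study's reported values were computed from its own (unpublished here) cell counts.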

“These latter findings suggest that genotyping for HPV-16/-18 might be useful for risk stratification among younger women,” the investigators suggested, noting that “further prospective study on this topic is warranted.”

The study was funded by the Swedish Cancer Foundation, the Stockholm County Council, the Swedish Research Council, the King Gustaf V Jubilee Fund, and the Karolinska Institute. During the study, one investigator received grants from VALGENT and the 7th Framework Programme of DG Research and Innovation (European Commission).

SOURCE: Fröberg M et al. Cancer. 2018 Dec 10. doi: 10.1002/cncr.31788.


FROM CANCER

Vitals

 

Key clinical point: Women with HPV-16/-18 are at significantly higher risk of high-grade cervical intraepithelial neoplasia (CIN), compared with women without HPV-16/-18, and therefore require close monitoring.

Major finding: Women younger than 30 years who test positive for HPV-16/-18 are almost 10 times as likely to develop high-grade CIN, compared with young women negative for HPV-16/-18 (odds ratio, 9.44).

Study details: A nested case-control study involving 96 women who developed high-grade CIN over the 9-year study period, compared with 480 age-matched controls who did not develop cervical lesions.

Disclosures: The study was funded by the Swedish Cancer Foundation, the Stockholm County Council, the Swedish Research Council, the King Gustaf V Jubilee Fund, and the Karolinska Institute. During the study, one investigator received grants from VALGENT and the 7th Framework Programme of DG Research and Innovation (European Commission).

Source: Fröberg M et al. Cancer. 2018 Dec 10. doi: 10.1002/cncr.31788.


Biomarker algorithm may offer noninvasive look at liver fibrosis

Article Type
Changed
Fri, 01/18/2019 - 18:11

 

Serum biomarkers may enable a noninvasive method of detecting advanced hepatic fibrosis in patients with nonalcoholic fatty liver disease (NAFLD), according to results from a recent study.

Nephron/Wikimedia/Creative Commons License

An algorithm created by the investigators distinguished NAFLD patients with advanced liver fibrosis from those with mild to moderate fibrosis, reported lead author Rohit Loomba, MD, of the University of California at San Diego and his colleagues.

“Liver biopsy is currently the gold standard for diagnosing NASH [nonalcoholic steatohepatitis] and staging liver fibrosis,” the investigators wrote in Clinical Gastroenterology and Hepatology. “However, it is a costly and invasive procedure with an all-cause mortality risk of approximately 0.2%. Liver biopsy typically samples only 1/50,000th of the organ, and it is liable to sampling error with an error rate of 25% for diagnosis of hepatic fibrosis.”

Existing serum-based tests are reliable for diagnosing nonfibrotic NAFLD, but they may misdiagnose patients with advanced fibrosis. Although imaging-based techniques may provide better diagnostic accuracy, some are available only for subgroups of patients, while others carry a high financial burden. These diagnostic shortcomings may have a major effect on patient outcomes, particularly for high-risk groups.

“Fibrosis stages F3 and F4 (advanced fibrosis) are primary predictors of liver-related morbidity and mortality, with 11%-22% of NASH patients reported to have advanced fibrosis,” the investigators noted.

The investigators therefore aimed to distinguish such high-risk NAFLD patients from those with mild or moderate liver fibrosis. Three biomarkers were included: hyaluronic acid (HA), TIMP metallopeptidase inhibitor 1 (TIMP-1), and alpha2-macroglobulin (A2M). Each biomarker has documented associations with liver fibrosis: higher A2M concentrations inhibit fibrinolysis, HA is associated with excess extracellular matrix and fibrotic tissue, and TIMP-1 is a known liver fibrosis marker that inhibits extracellular matrix degradation. The relative strength of each in detecting advanced liver fibrosis was determined through an algorithm.

The investigators relied on archived serum samples from Duke University, Durham, N.C., (n = 792) and the University of California at San Diego (n = 244) that were collected within 11 days of liver biopsy. Biopsies were performed with 15- to 16-gauge needles and sampled at least eight portal tracts; these specimens were used to diagnose NAFLD. Patients with alcoholic liver disease or hepatitis C virus were excluded.

Algorithm training was based on serum measurements from 396 patients treated at Duke University. Samples were labeled as mild to moderate (F0-F2) or advanced (F3-F4) fibrosis and split into 10 subsets. A logistic regression model was trained on nine subsets and tested on the 10th, iterating through this sequence until all 10 subsets had been tested; this process was then repeated 10,000 times. Using the median coefficients from the resulting 100,000 logistic regression models, the algorithm scored samples from 0 to 100, with higher numbers representing more advanced fibrosis, and the relative weight of each biomarker measurement was determined.
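The training scheme described above can be sketched as repeated k-fold cross-validation of a logistic regression, with the median coefficients then used to score new samples on a 0-100 scale. The code below is a simplified illustration under assumptions: toy synthetic data stand in for the three biomarkers, a plain gradient-ascent fitter replaces whatever solver the authors used, and the number of repeats is reduced from 10,000 for speed.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain gradient-ascent logistic regression; returns weights incl. intercept."""
    Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)       # gradient of the log-likelihood
    return w

def cv_median_weights(X, y, k=10, repeats=10):
    """Repeated k-fold training; keep the median of all fold coefficients."""
    weights = []
    for _ in range(repeats):
        idx = rng.permutation(len(y))
        folds = np.array_split(idx, k)
        for i in range(k):
            train = np.concatenate([f for j, f in enumerate(folds) if j != i])
            weights.append(fit_logistic(X[train], y[train]))
    return np.median(np.array(weights), axis=0)

def score_0_100(X, w):
    """Map the predicted probability of advanced fibrosis onto a 0-100 scale."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return 100.0 / (1.0 + np.exp(-Xb @ w))

# Toy data: 3 standardized markers as stand-ins for HA, TIMP-1, and A2M.
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.0, 0.8, 0.6]) + rng.normal(scale=0.5, size=200) > 0).astype(float)
w = cv_median_weights(X, y)
scores = score_0_100(X, w)
```

Taking the median across many resampled fits is one way to stabilize coefficients against fold-to-fold variability; the resulting single weight vector is what gets deployed.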

A noninferiority protocol was used to validate the algorithm, through which the area under the receiver operating characteristic (AUROC) curve was calculated. The AUROC curve of the validation samples was 0.856, with 0.5 being the score for a random algorithm. The algorithm correctly classified 90.0% of F0 cases, 75.0% of F1 cases, 53.8% of F2 cases, 77.4% of F3 cases, and 94.4% of F4 cases. The sensitivity was 79.7% and the specificity was 75.7%.
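The validation metrics quoted above (sensitivity, specificity, AUROC) can all be computed from continuous scores and binary labels. The following is a minimal sketch with made-up values, not the study's validation code; the AUROC uses the rank-based (Mann-Whitney) identity.

```python
import numpy as np

def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity of the rule `score >= threshold`."""
    pred = scores >= threshold
    sens = np.mean(pred[labels == 1])    # true-positive rate among positives
    spec = np.mean(~pred[labels == 0])   # true-negative rate among negatives
    return sens, spec

def auroc(scores, labels):
    """AUROC = probability a random positive outscores a random negative,
    counting ties as half."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Hypothetical 0-100 scores and advanced-fibrosis labels.
scores = np.array([10.0, 30.0, 45.0, 55.0, 70.0, 90.0])
labels = np.array([0, 0, 0, 1, 1, 1])
```

On this perfectly separated toy set both sensitivity and specificity at a threshold of 50 are 1.0 and the AUROC is 1.0; real data, like the study's AUROC of 0.856, sit between 0.5 (chance) and 1.0.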

The algorithm was superior to Fibrosis-4 (FIB-4) and NAFLD Fibrosis Score (NFS) in two validation cohorts. In a combination of validation cohorts, the algorithm correctly identified 79.5% of F3-F4 patients, compared with rates of 25.8% and 28.0% from FIB-4 and NFS, respectively. The investigators noted that the algorithm was unaffected by sex or age. In contrast, FIB-4 is biased toward females, and both FIB-4 and NFS are less accurate with patients aged 35 years or younger.

“Performance of the training and validation sets was robust and well matched, enabling the reliable differentiation of NAFLD patients with and without advanced fibrosis,” the investigators concluded.

The study was supported by Prometheus Laboratories. Authors not employed by Prometheus Laboratories were employed by Duke University or the University of California, San Diego; each institution received funding from Prometheus Laboratories.

SOURCE: Loomba R et al. Clin Gastroenterol Hepatol. 2018 Nov 15. doi: 10.1016/j.cgh.2018.11.004.



FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY

Vitals

 

Key clinical point: A serum biomarker–based algorithm may provide a noninvasive method of detecting advanced hepatic fibrosis in patients with nonalcoholic fatty liver disease (NAFLD).

Major finding: The area under the receiver operator characteristic (AUROC) curve for a combination of validation samples was 0.856.

Study details: A retrospective study of liver fibrosis serum markers and clinical data from 396 patients with NAFLD and various stages of fibrosis.

Disclosures: The study was supported by Prometheus Laboratories. Authors not employed by Prometheus Laboratories were employed by Duke University or the University of California, San Diego; each institution received funding from Prometheus Laboratories.

Source: Loomba R et al. Clin Gastroenterol Hepatol. 2018 Nov 15. doi: 10.1016/j.cgh.2018.11.004.


‘Error neuron’ EEG findings could open up future clinical applications

Article Type
Changed
Mon, 01/07/2019 - 10:51

 

Single neurons in the human medial frontal cortex appear to be involved in the signaling of self-monitored errors, and this activity can be tracked through a scalp EEG pattern called error-related negativity, according to findings from experiments carried out during intracranial EEG recordings of candidates for surgical treatment of epilepsy.

Epifantsev/Thinkstock

“Our results suggest that coordinated neural activity can serve as a substrate for information routing that enables the performance-monitoring system to communicate the need for behavioral control to other brain regions, including those that maintain flexible goal information, such as the lateral prefrontal cortex and the frontal polar cortex,” first author Zhongzheng Fu, a PhD student at the California Institute of Technology in Pasadena, Calif., and Cedars-Sinai Medical Center, Los Angeles, and his colleagues reported in Neuron.

The findings offer insights that could lead to treatments for conditions in which the important executive function task of error self-monitoring is unbalanced, such as obsessive-compulsive disorder and schizophrenia, the authors noted in a press release.

“We discovered that the activity of error neurons correlates with the size of the ERN [error-related negativity],” Mr. Fu said. “This identifies the brain area that causes the ERN and helps explain what it signifies. This new insight might allow doctors to use the ERN as a standard tool to diagnose mental diseases and monitor responses to treatment.”

Error neuron firing and intracranial ERN occurred first in pre-supplementary motor area (pre-SMA), then in the dorsal anterior cingulate cortex (dACC) about 50 ms later, with significant correlations between firing and intracranial ERN in both locations. In dACC, this activity, with error-integrating neuron responses, correlated with magnitude of post-error slowing (PES).

Previous research suggested a link between “the detection of self-generated errors, as reflected in the ERN, with changes in cognitive control, as exhibited behaviorally in PES,” the investigators wrote. “However, several electroencephalogram (EEG) studies have failed to find a significant relationship between PES and ERN.”

The present study involved intracranial EEG of 29 candidates for surgical treatment of epilepsy and scalp EEG of 12 control participants, with each modality measuring activity in the frontal cortex. Both cohorts performed a rapid version of the color-word Stroop task, in which the words “red,” “green,” or “blue” were printed either in corresponding or noncorresponding colors of red, green, or blue. Subjects were presented various color-word combinations while being asked to click one of three buttons indicating the color of the word as quickly as possible. The investigators monitored neuronal activity throughout, discarding responses that were too slow.

As found in previous trials, the subjects demonstrated the “Stroop effect”: a slower response when word and color are incongruent (224.9 ms difference; P less than .001). As anticipated, correct responses that followed correct responses were faster than correct responses that followed errors; this slowing after an error defines PES.
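Post-error slowing itself is a simple quantity to extract from a trial sequence: the mean reaction time on correct trials that follow an error, minus the mean on correct trials that follow a correct response. A sketch with made-up reaction times (not the study's analysis code):

```python
def post_error_slowing(rts, correct):
    """PES in ms: mean RT of correct trials after an error minus
    mean RT of correct trials after a correct response."""
    after_err, after_ok = [], []
    for prev in range(len(rts) - 1):
        cur = prev + 1
        if not correct[cur]:                 # only correct current trials count
            continue
        (after_ok if correct[prev] else after_err).append(rts[cur])
    mean = lambda xs: sum(xs) / len(xs)
    return mean(after_err) - mean(after_ok)

# Hypothetical Stroop trial sequence: RTs in ms and correctness flags.
rts     = [500, 480, 700, 650, 510, 490, 720, 640]
correct = [True, True, False, True, True, True, False, True]
pes = post_error_slowing(rts, correct)
```

A positive value indicates slowing after errors, the behavioral signature that the dACC error-integrating activity was found to predict.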

In the intracranial EEG group, the investigators isolated 1,171 neurons, of which 618 were located in dACC and 553 in pre-SMA. Using a Poisson regression model and correlations with erroneous responses, the investigators identified 99 “type I” error neurons in dACC and 118 in pre-SMA, based on higher frequency of firing during erroneous responses than during correct responses. At a single-cell level, error neuron mean spike rates were highest when intracranial ERN amplitude was greatest, such that error neuron firing in dACC and pre-SMA had maximal likelihood ratios of 7.9 (P = .01) and 15.1 (P less than .001), respectively. The strength of correlation between intracranial ERN and error neuron firing rate was directly related to PES magnitude exclusively in the dACC (maximum likelihood ratio of 13.9; P = .015). In post-error trials, faster error-integrating neuron firing rates in dACC predicted greater PES (maximal likelihood ratio of 18.3; P less than .001).
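The “type I” error-neuron classification described above rests on Poisson regression of spike counts on trial outcome. Below is a generic Poisson GLM fit by Newton's method (Fisher scoring) on synthetic spike counts; it is a hedged sketch of the technique, not the authors' exact model, and the effect size and data are invented.

```python
import numpy as np

def poisson_regression(X, y, iters=25):
    """Fit the log-linear Poisson model E[y] = exp(X @ w) by Newton's method."""
    Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        mu = np.exp(Xb @ w)
        grad = Xb.T @ (y - mu)                  # score of the log-likelihood
        hess = Xb.T @ (Xb * mu[:, None])        # Fisher information matrix
        w += np.linalg.solve(hess, grad)
    return w

rng = np.random.default_rng(1)
is_error = rng.integers(0, 2, size=300).astype(float)  # trial outcome (1 = error)
rates = np.exp(0.5 + 1.0 * is_error)                   # firing rate rises on errors
spikes = rng.poisson(rates).astype(float)              # simulated spike counts
w = poisson_regression(is_error[:, None], spikes)
# A reliably positive w[1] would flag this unit as firing more on error trials.
```

In practice one would test the outcome coefficient against zero (e.g., with a likelihood-ratio test) before labeling a unit an error neuron.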

The study was funded by the National Institutes of Health, the McKnight Endowment for Neuroscience, and the National Science Foundation. The authors declared no conflicts of interest.

SOURCE: Fu Z et al. Neuron. 2018 Dec 4. doi: 10.1016/j.neuron.2018.11.016

Publications
Topics
Sections

 

Single neurons in the human medial frontal cortex appear to be involved in the signaling of self-monitored errors, and this activity can be tracked through a scalp EEG pattern called error-related negativity, according to findings from experiments carried out during intracranial EEG recordings of candidates for surgical treatment of epilepsy.

Epifantsev/Thinkstock

“Our results suggest that coordinated neural activity can serve as a substrate for information routing that enables the performance-monitoring system to communicate the need for behavioral control to other brain regions, including those that maintain flexible goal information, such as the lateral prefrontal cortex and the frontal polar cortex,” first author Zhongzheng Fu, a PhD student at the California Institute of Technology in Pasadena, Calif., and Cedars-Sinai Medical Center, Los Angeles, and his colleagues reported in Neuron.

The findings offer insights that could lead to treatments for conditions in which the important executive function task of error self-monitoring is unbalanced, such as obsessive-compulsive disorder and schizophrenia, the authors noted in a press release.

“We discovered that the activity of error neurons correlates with the size of the ERN [error-related negativity],” Mr. Fu said. “This identifies the brain area that causes the ERN and helps explain what it signifies. This new insight might allow doctors to use the ERN as a standard tool to diagnose mental diseases and monitor responses to treatment.”

Error neuron firing and intracranial ERN occurred first in pre-supplementary motor area (pre-SMA), then in the dorsal anterior cingulate cortex (dACC) about 50 ms later, with significant correlations between firing and intracranial ERN in both locations. In dACC, this activity, with error-integrating neuron responses, correlated with magnitude of post-error slowing (PES).

Previous research suggested a link between “the detection of self-generated errors, as reflected in the ERN, with changes in cognitive control, as exhibited behaviorally in PES,” the investigators wrote. “However, several electroencephalogram (EEG) studies have failed to find a significant relationship between PES and ERN.”

The present study involved intracranial EEG of 29 candidates for surgical treatment of epilepsy and scalp EEG of 12 control participants, with each modality measuring activity in the frontal cortex. Both cohorts performed a rapid version of the color-word Stroop task, in which the words “red,” “green,” or “blue” were printed either in corresponding or noncorresponding colors of red, green, or blue. Subjects were presented various color-word combinations while being asked to click one of three buttons indicating the color of the word as quickly as possible. The investigators monitored neuronal activity throughout, discarding responses that were too slow.

As found in previous trials, the subjects demonstrated the “Stroop effect,” which refers to a slower response when word and color are incongruent (224.9 ms difference; P less than .001). As anticipated, correct responses following correct responses were faster than were correct responses following erroneous responses, which defines PES.

In the intracranial EEG group, the investigators isolated 1,171 neurons, of which 618 were located in dACC and 553 in pre-SMA. Using a Poisson regression model and correlations with erroneous responses, the investigators identified 99 “type I” error neurons in dACC and 118 in pre-SMA, based on higher frequency of firing during erroneous responses than during correct responses. At a single-cell level, error neuron mean spike rates were highest when intracranial ERN amplitude was greatest, such that error neuron firing in dACC and pre-SMA had maximal likelihood ratios of 7.9 (P = .01) and 15.1 (P less than .001), respectively. The strength of correlation between intracranial ERN and error neuron firing rate was directly related to PES magnitude exclusively in the dACC (maximum likelihood ratio of 13.9; P = .015). In post-error trials, faster error-integrating neuron firing rates in dACC predicted greater PES (maximal likelihood ratio of 18.3; P less than .001).

The study was funded by the National Institutes of Health, the McKnight Endowment for Neuroscience, and the National Science Foundation. The authors declared no conflicts of interest.

SOURCE: Fu Z et al. Neuron. 2018 Dec 4. doi: 10.1016/j.neuron.2018.11.016

 

Single neurons in the human medial frontal cortex appear to be involved in the signaling of self-monitored errors, and this activity can be tracked through a scalp EEG pattern called error-related negativity, according to findings from experiments carried out during intracranial EEG recordings of candidates for surgical treatment of epilepsy.

Epifantsev/Thinkstock

“Our results suggest that coordinated neural activity can serve as a substrate for information routing that enables the performance-monitoring system to communicate the need for behavioral control to other brain regions, including those that maintain flexible goal information, such as the lateral prefrontal cortex and the frontal polar cortex,” first author Zhongzheng Fu, a PhD student at the California Institute of Technology in Pasadena, Calif., and Cedars-Sinai Medical Center, Los Angeles, and his colleagues reported in Neuron.

The findings offer insights that could lead to treatments for conditions in which the important executive function task of error self-monitoring is unbalanced, such as obsessive-compulsive disorder and schizophrenia, the authors noted in a press release.

“We discovered that the activity of error neurons correlates with the size of the ERN [error-related negativity],” Mr. Fu said. “This identifies the brain area that causes the ERN and helps explain what it signifies. This new insight might allow doctors to use the ERN as a standard tool to diagnose mental diseases and monitor responses to treatment.”

Error neuron firing and intracranial ERN occurred first in pre-supplementary motor area (pre-SMA), then in the dorsal anterior cingulate cortex (dACC) about 50 ms later, with significant correlations between firing and intracranial ERN in both locations. In dACC, this activity, with error-integrating neuron responses, correlated with magnitude of post-error slowing (PES).

Previous research suggested a link between “the detection of self-generated errors, as reflected in the ERN, with changes in cognitive control, as exhibited behaviorally in PES,” the investigators wrote. “However, several electroencephalogram (EEG) studies have failed to find a significant relationship between PES and ERN.”

The present study involved intracranial EEG of 29 candidates for surgical treatment of epilepsy and scalp EEG of 12 control participants, with each modality measuring activity in the frontal cortex. Both cohorts performed a rapid version of the color-word Stroop task, in which the words “red,” “green,” or “blue” were printed either in corresponding or noncorresponding colors of red, green, or blue. Subjects were presented various color-word combinations while being asked to click one of three buttons indicating the color of the word as quickly as possible. The investigators monitored neuronal activity throughout, discarding responses that were too slow.

As found in previous trials, the subjects demonstrated the “Stroop effect,” which refers to a slower response when word and color are incongruent (224.9 ms difference; P less than .001). As anticipated, correct responses following correct responses were faster than were correct responses following erroneous responses, which defines PES.

In the intracranial EEG group, the investigators isolated 1,171 neurons, of which 618 were located in dACC and 553 in pre-SMA. Using a Poisson regression model and correlations with erroneous responses, the investigators identified 99 “type I” error neurons in dACC and 118 in pre-SMA, based on higher frequency of firing during erroneous responses than during correct responses. At a single-cell level, error neuron mean spike rates were highest when intracranial ERN amplitude was greatest, such that error neuron firing in dACC and pre-SMA had maximal likelihood ratios of 7.9 (P = .01) and 15.1 (P less than .001), respectively. The strength of correlation between intracranial ERN and error neuron firing rate was directly related to PES magnitude exclusively in the dACC (maximum likelihood ratio of 13.9; P = .015). In post-error trials, faster error-integrating neuron firing rates in dACC predicted greater PES (maximal likelihood ratio of 18.3; P less than .001).
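
The selection criterion described above, higher Poisson firing rates on error trials than on correct trials judged by a likelihood-ratio statistic, can be sketched in a few lines. This is an illustrative reconstruction with made-up spike counts, not the authors' analysis code, and the function names are hypothetical.

```python
import math

def poisson_loglik(counts, rate):
    # log-likelihood of spike counts under a Poisson distribution with the given rate
    return sum(k * math.log(rate) - rate - math.lgamma(k + 1) for k in counts)

def error_neuron_lr(error_counts, correct_counts):
    """Likelihood-ratio statistic comparing separate firing rates on error
    vs. correct trials (alternative) against one pooled rate (null)."""
    pooled = error_counts + correct_counts
    lam_pooled = sum(pooled) / len(pooled)
    lam_err = sum(error_counts) / len(error_counts)
    lam_cor = sum(correct_counts) / len(correct_counts)
    ll_null = poisson_loglik(pooled, lam_pooled)
    ll_alt = (poisson_loglik(error_counts, lam_err)
              + poisson_loglik(correct_counts, lam_cor))
    return 2 * (ll_alt - ll_null)

# hypothetical spike counts: this neuron fires more on error trials
lr = error_neuron_lr([8, 9, 10, 11], [2, 3, 2, 3])
```

With one degree of freedom, a statistic above roughly 3.84 is significant at the .05 level, which is the spirit of the likelihood-ratio thresholds reported above; the paper's actual likelihood ratios come from its own regression models.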

The study was funded by the National Institutes of Health, the McKnight Endowment for Neuroscience, and the National Science Foundation. The authors declared no conflicts of interest.

SOURCE: Fu Z et al. Neuron. 2018 Dec 4. doi: 10.1016/j.neuron.2018.11.016


IgA vasculitis increases risks for hypertension, chronic kidney disease


IgA vasculitis, also called Henoch-Schönlein purpura, increases risks for hypertension and chronic kidney disease (CKD), according to a retrospective study of more than 13,000 patients with IgAV.

In patients with adult-onset IgA vasculitis (IgAV), mortality risk is also increased, reported first author Alexander Tracy and his colleagues at the University of Birmingham (England).

“Long-term health outcomes of adult-onset IgAV are not well characterized,” the investigators wrote in Annals of the Rheumatic Diseases. “Most evidence regarding complications of IgAV in adults derives from case reports and case series; there is need for controlled epidemiological studies to address this question.”

The retrospective study compared 2,828 patients with adult-onset IgAV and 10,405 patients with childhood-onset IgAV against sex- and age-matched controls. Patients diagnosed at age 16 years or older were classified as having adult-onset disease. The investigators drew their data from The Health Improvement Network database, which includes 3.6 million active patients from more than 675 general practices in the United Kingdom. Patients in the present study were diagnosed with IgAV between 2005 and 2016. After diagnosis, participant follow-up continued until any of the following occurred: outcome event, patient left practice, death, the practice stopped contributing data, or the study ended. Primary outcomes for adult-onset patients were venous thromboembolism (VTE), ischemic heart disease, hypertension, stage 3-5 CKD, stroke/transient ischemic attack, and all-cause mortality. Primary outcomes for patients with childhood-onset disease were limited to CKD, hypertension, and VTE.

The incidence of childhood-onset IgAV was 27.22 per 100,000 person-years, whereas adult-onset disease was much less common at 2.20 per 100,000 person-years. Mean age at onset of childhood IgAV was 6.68 years. The adult-onset group had a mean age at diagnosis of 38.1 years.
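
These incidence figures are simple rate arithmetic: events divided by person-years of follow-up, rescaled per 100,000. The counts below are hypothetical, chosen only to illustrate the unit (11 cases over 500,000 person-years gives the adult-onset rate of 2.20).

```python
def incidence_per_100k(cases, person_years):
    # incidence rate expressed per 100,000 person-years of follow-up
    return cases / person_years * 100_000

rate = incidence_per_100k(11, 500_000)  # hypothetical counts
```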


Compared with controls, all patients with IgAV, regardless of onset age, had increased risks of hypertension (adult-onset adjusted hazard ratio, 1.42; P less than .001; childhood-onset aHR, 1.52; P less than .001) and CKD (adult-onset aHR, 1.54; P less than .001; childhood-onset aHR, 1.89; P = .01). Patients with adult-onset IgAV showed increased risk of death, compared with controls (aHR, 1.27; P = .006). No associations were found between IgAV and stroke/transient ischemic attack, VTE, or ischemic heart disease.

“These findings emphasize the importance of blood pressure and renal function monitoring in patients with IgAV,” the investigators concluded. “Our data also suggest that IgAV should not be considered a ‘single-hit’ disease, but that clinicians should monitor for long-term sequelae. Further research is required to clarify the cause of hypertension in patients with IgAV, and to investigate whether such patients suffer from additional long-term sequelae that are currently unrecognized.”

The investigators reported no funding sources or conflicts of interest.

SOURCE: Tracy A et al. Ann Rheum Dis. 2018 Nov 28. doi: 10.1136/annrheumdis-2018-214142.

Vitals

Key clinical point: IgA vasculitis increases risks of hypertension and chronic kidney disease in all patients and increases risk of death in patients with adult-onset disease.

Major finding: There was significantly increased risk of stage 3-5 chronic kidney disease in patients with childhood-onset IgA vasculitis (adjusted hazard ratio, 1.89; P = .01).

Study details: A retrospective study of 2,828 patients with adult-onset IgA vasculitis and 10,405 patients with childhood-onset IgAV, compared with sex-matched and age-matched controls.

Disclosures: No funding sources or conflicts of interest were reported.

Source: Tracy A et al. Ann Rheum Dis. 2018 Nov 28. doi: 10.1136/annrheumdis-2018-214142.


Proposed neuroblastoma classification scheme hinges on telomere maintenance mechanisms


Telomere maintenance mechanisms, RAS mutations, and p53 mutations can be used to mechanistically classify clinical phenotypes of neuroblastoma, according to investigators.

Genomic analysis of neuroblastomas showed that the aforementioned markers were strongly associated with outcome and other disease characteristics, reported Sandra Ackermann, MD, of the department of experimental pediatric oncology at the University Children’s Hospital of Cologne (Germany), and her colleagues.

Although previous studies have shown relationships between genetic alterations and behavior of neuroblastomas, “to date, these genomic data have not produced a coherent model of pathogenesis that can explain the extremely divergent clinical phenotypes of neuroblastoma,” the investigators wrote in Science.

The present study involved genomic sequencing of 416 pretreatment neuroblastomas, with tests for telomere maintenance mechanisms, RAS-pathway mutations, and p53-pathway mutations.

Based on existing data, the investigators first devised a panel of 17 genes related to the RAS pathway (including ALK) and 6 related to the p53 pathway. Of 198 cases, 28 tested positive for RAS- or p53-pathway abnormalities (17.8%). Positivity was more common in high-risk tumors than in non–high-risk tumors (21.3% vs. 13.3%; P = .048), and in both risk groups, positivity was associated with poor outcome (hazard ratio, 2.056; P = .001).

However, because clinical courses varied widely among non–high-risk patients with RAS/p53 mutations, the investigators recognized that a piece of the puzzle was missing. They hypothesized that telomere maintenance mechanisms could also be playing a role. Following several intervening experiments, the investigators devised telomere maintenance mechanism testing, defined by MYCN amplification or TERT rearrangements, elevated TERT expression if negative for these abnormalities, or presence of ALT-associated promyelocytic leukemia nuclear bodies. Subsequent testing revealed that positivity for these parameters was associated with a hazard ratio of 5.184 (P less than .001), thereby confirming that telomere maintenance mechanisms could independently predict survival.

“Together, our findings demonstrate that the divergent clinical phenotypes of human neuroblastoma are driven by molecular alterations affecting telomere maintenance and RAS or p53 pathways, suggesting a mechanistic classification of this malignancy,” the authors concluded.

The proposed classification scheme also includes associations with other genetic features (tumor cell ploidy, segmental copy number alterations, MYCN/TERT/ATRX alterations, and gene expression favorability) and clinical characteristics (stage of disease and age at diagnosis).
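
The decision logic of the proposed scheme can be paraphrased as a small rule set. The grouping labels below are a simplification for illustration, not the paper's exact nomenclature, and the parameter names are hypothetical.

```python
def telomere_maintenance_positive(mycn_amplified, tert_rearranged,
                                  tert_expression_high, alt_pml_bodies):
    # TMM positivity as defined in the text: MYCN amplification, TERT
    # rearrangement, elevated TERT expression, or ALT-associated
    # promyelocytic leukemia nuclear bodies
    return (mycn_amplified or tert_rearranged
            or tert_expression_high or alt_pml_bodies)

def mechanistic_group(tmm_positive, ras_or_p53_mutant):
    # simplified three-tier reading of the proposed classification
    if tmm_positive and ras_or_p53_mutant:
        return "TMM+/RAS-p53 mutant (poorest outcome)"
    if tmm_positive:
        return "TMM+ (high risk)"
    return "TMM- (favorable)"
```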

The study was funded by the German Cancer Aid, the German Ministry of Science and Education, the MYC-NET, the Deutsche Forschungsgemeinschaft, the Berlin Institute of Health, the European Union, and others. One coauthor reported financial relationships with Biogazelle and pxlence, and another reported consulting fees from NEO New Oncology.

SOURCE: Ackermann S et al. Science. 2018 Dec 7. doi: 10.1126/science.aat6768.

Vitals

 

Key clinical point: A proposed mechanistic classification of clinical phenotypes in neuroblastoma is based on presence of telomere maintenance mechanisms, along with RAS and p53 mutations.

Major finding: The presence of telomere maintenance mechanisms was associated with a hazard ratio of 5.184 (P less than .001).

Study details: A genome sequencing of 416 pretreatment neuroblastomas, with tests for telomere maintenance mechanisms, RAS-pathway mutations, and p53-pathway mutations.

Disclosures: The study was funded by the German Cancer Aid, the German Ministry of Science and Education, the MYC-NET, the Deutsche Forschungsgemeinschaft, the Berlin Institute of Health, the European Union, and others. One coauthor reported financial relationships with Biogazelle and pxlence, and another reported consulting fees from NEO New Oncology.

Source: Ackermann S et al. Science. 2018 Dec 7. doi: 10.1126/science.aat6768.


RCC research has opened door to the future, but “much work remains to be done”


Recent research on the genetic basis of renal cell carcinoma has expanded and improved treatment options; however, personalized medicine is still largely unavailable, so future efforts should aim to link genetic knowledge with prognosis and treatment selection, according to the authors of a recent review article.

The article, written by Christopher D’Avella, MD, of the Fox Chase Cancer Center in Philadelphia, and his colleagues, provides an overview of renal cell carcinoma (RCC) mutations and associated therapies, with updates on ongoing trials and a look at future directions.

“The expansion of treatment options for patients with advanced RCC over the past 15 years is a testament to enhanced understanding of the genetics and genomics of RCC and the ability to apply this knowledge to drug development,” the authors wrote in Urologic Oncology. “However, much work remains to be done as there are still no validated biomarkers to select patient treatment, and in only rare cases, the knowledge of particular mutations in RCC can lead to rational treatment selection.”

RCC accounts for approximately 80%-85% of renal tumors. About three out of four RCC patients have clear cell disease, of whom about 30% develop metastases and need systemic therapy. The authors pointed out that vascular endothelial growth factor tyrosine kinase inhibitors (TKIs) have been standard first-line care for these patients since the mid-2000s, based on improved molecular understanding. Still, responses to TKIs are limited and patients eventually develop resistance. Several agents are in development to overcome this obstacle, including inhibitors of hypoxia inducible factor, which have recently shown promise. Among biomarkers for ccRCC, PBRM1 mutations may be associated with susceptibility to checkpoint inhibitors, and TSC1 could predict response to mTOR (mammalian target of rapamycin) inhibition.

Along with clear cell RCC, the review article addressed topics in papillary and sarcomatoid subtypes.

Patients with papillary RCC often have MET mutations, and ongoing research is focused on associated targeted therapies. For example, savolitinib is a highly selective MET inhibitor that has shown promise in this patient subgroup.

Sarcomatoid features remain characteristic of large and aggressive tumors. Unfortunately, treatment options are currently limited in this area. Recent studies suggest that TP53 and NF2 mutations are associated with sarcomatoid differentiation.

“Future studies should explore linking genetics to prognosis, resistance to targeted therapies, and the identification of future therapeutic targets,” the authors concluded.

SOURCE: D’Avella C et al. Urol Oncol. 2018 Nov 23. doi: 10.1016/j.urolonc.2018.10.027.

Vitals

Key clinical point: Recent research on the genetic basis of renal cell carcinoma has expanded and improved treatment options but personalized medicine is still largely unavailable.

Major finding: There are still no validated biomarkers to select patient treatment, and in only rare cases does the knowledge of particular mutations in renal cell carcinoma lead to rational treatment selection.

Study details: A review article of mutations in renal cell carcinoma and associated treatment options.

Disclosures: This work was supported by the Canadian Cancer Society and the Canadian Institutes of Health Research.

Source: D’Avella C et al. Urol Oncol. 2018 Nov 23. doi: 10.1016/j.urolonc.2018.10.027.


hTERT expression predicts RCC survival, tumor aggressiveness

Article Type
Changed
Fri, 01/04/2019 - 14:27

Human telomerase reverse transcriptase (hTERT) protein expression is associated with clear cell renal carcinoma (ccRCC) tumor aggressiveness and disease-specific survival (DSS), according to investigators.

Associations between hTERT expression and clinicopathologic features and outcomes were less robust or nonexistent in papillary and chromophobe subtypes, reported Leili Saeednejad Zanjani, MD, of the Oncopathology Research Center at Iran University of Medical Sciences in Tehran, and colleagues.

“Evidence shows that telomerase is expressed in 85% of malignancies, and the level of its activity is higher in advanced and metastatic tumors,” the authors wrote in Pathology.

“A number of clinical studies have been performed to evaluate the association between telomerase activity and clinicopathological parameters in renal cancer showing that telomerase activity level correlates with progression of RCC,” Dr. Zanjani and associates wrote. As none of these specifically evaluated hTERT protein expression, the investigators conducted a study to learn more.

The investigators analyzed hTERT expression level in 176 cases of RCC, requiring three core biopsies from each tumor because of concerns about intratumoral heterogeneity. The population consisted of 113 clear cell, 12 type I papillary, 20 type II papillary, and 31 chromophobe subtypes. Patient and clinicopathologic features were compared with survival and hTERT expression. Median follow-up time was 42 months.

Correlations between hTERT expression and disease characteristics were pronounced in cases of ccRCC, compared with other subtypes. In ccRCC, hTERT expression was significantly associated with tumor stage, nucleolar grade, tumor size, microvascular invasion, lymph node invasion, renal pelvis involvement, renal sinus fat involvement, Gerota fascia invasion, and distant metastasis. Survival analysis showed that DSS of ccRCC patients with high hTERT expression was 58 months, compared with 68 months for those with low hTERT expression (P = .012). Other parameters associated with survival were nucleolar grade, tumor stage, and tumor size.

For type I and II papillary subtypes, associations were found between hTERT expression and tumor stage and distant metastasis. In contrast, chromophobe RCC showed no such relationships. No associations were found between hTERT expression and survival in any of these three subtypes, for slightly different reasons: no patients with type I disease died of renal cancer, precluding a Kaplan-Meier survival analysis, whereas type II and chromophobe survival curves showed no significant relationship with hTERT expression. Likewise, no clinicopathologic characteristics of these subtypes were associated with survival.

“From these findings we are able to conclude that hTERT protein expression may be a novel prognostic indicator of worse outcome in tumor biopsies of patients with ccRCC, if follow up time is more prolonged,” the authors wrote. They noted that “telomerase is an attractive and ideal target for therapy due to overexpression in the majority of malignancies and low or nonexpression in most somatic cells.”

The study was funded by the Iran National Science Foundation. The authors declared no conflicts of interest.
 

SOURCE: Zanjani LS et al. Pathology. 2018 Nov 19. doi: 10.1016/j.pathol.2018.08.019.



FROM PATHOLOGY

Vitals


Key clinical point: Human telomerase reverse transcriptase (hTERT) protein expression is associated with clear cell renal carcinoma (ccRCC) tumor aggressiveness and disease-specific survival (DSS).

Major finding: DSS of ccRCC patients with high hTERT expression was 58 months, compared with 68 months for those with low hTERT expression (P = .012).

Study details: An analysis of hTERT protein expression and disease characteristics in 176 patients with RCC. The subtype population consisted of 113 clear cell, 12 type I papillary, 20 type II papillary, and 31 chromophobe cases.

Disclosures: The study was funded by the Iran National Science Foundation. The authors declared no conflicts of interest.

Source: Zanjani LS et al. Pathology. 2018 Nov 19. doi: 10.1016/j.pathol.2018.08.019.
