European guideline for diagnosing C. difficile infection updated
The European Society of Clinical Microbiology and Infectious Diseases has updated its clinical guideline for diagnosing Clostridium difficile infection, according to a report in Clinical Microbiology and Infection.
“Our aim is to not only improve diagnosis of C. difficile infection, but also to standardize the diagnostic process across Europe to allow for improved surveillance of the disease,” Ed J. Kuijper, MD, of the Centre for Infectious Diseases, Leiden (the Netherlands) University Medical Centre and lead investigator for the new guideline, said in a press statement.
The Society released its first guideline regarding C. difficile diagnosis in 2009, but it required revision because many new diagnostic tests have become commercially available since then. The updated guideline focuses on diagnosing patients of all ages with diarrhea who are suspected of having C. difficile infection and is intended for use by medical microbiologists, gastroenterologists, infectious disease specialists, and infection control practitioners, said Monique J.T. Crobach, MD, who is also of Leiden University and is first author of the guideline, and her associates (Clin Microbiol Infect. 2016;22:S63-81).
They performed a comprehensive meta-analysis of 56 studies that compared 24 commercially available diagnostic assays against either of the current “gold standard” tests, the cell cytotoxicity neutralization assay or the toxigenic culture. Forty-one of these studies were published after 2009. Based on their findings, Dr. Crobach and her associates formulated 16 recommendations and suggestions, noting the quality of evidence supporting each one.
The updated guideline strongly recommends against using any single diagnostic assay to diagnose C. difficile infection, regardless of the technology on which it is based. Instead, diagnosis should rest on clinical signs and symptoms together with a two-step algorithm starting with either a nucleic acid amplification test or a glutamate dehydrogenase enzyme immunoassay. Samples with positive results on this initial test should be tested further with a toxin A/B enzyme immunoassay.
An alternative algorithm is to initially test samples with both a glutamate dehydrogenase and a toxin A/B enzyme immunoassay. The guideline spells out what actions to take in the event of concordant positive results, concordant negative results, or discordant results.
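The two-step logic just described can be sketched as a simple decision procedure. This is purely illustrative, not part of the guideline: the function name, input structure, and the wording of the result labels are all assumptions, and the handling of screen-positive/toxin-negative samples is summarized only at a high level here.

```python
def diagnose_two_step(sample):
    """Sketch of the guideline's first two-step algorithm.

    Step 1: screen with a NAAT or a GDH enzyme immunoassay.
    Step 2: confirm positives with a toxin A/B enzyme immunoassay.
    The returned labels are illustrative, not clinical recommendations.
    """
    # Step 1: high-sensitivity screen (NAAT or GDH EIA).
    if not sample["naat_or_gdh_positive"]:
        return "CDI unlikely"  # a negative screen makes infection unlikely
    # Step 2: toxin A/B EIA on screen-positive samples only.
    if sample["toxin_ab_positive"]:
        return "CDI likely"  # both tests positive
    # Screen positive but toxin negative: the organism may be present
    # without detectable toxin; clinical evaluation decides.
    return "possible carriage; clinical evaluation needed"


print(diagnose_two_step({"naat_or_gdh_positive": True, "toxin_ab_positive": True}))
```

The alternative algorithm (parallel GDH plus toxin A/B testing) would replace the sequential `if` checks with a comparison of the two concordant or discordant results.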
Testing for C. difficile should not be limited to samples with a specific request from a physician. All unformed stool samples from patients aged 3 years and older should be tested for the organism. In contrast, formed stool samples shouldn’t be tested for C. difficile unless the patient has paralytic ileus.
The guideline also addresses repeat testing. Performing a repeat test after an initial negative result during the same diarrheal episode may be useful in selected cases with ongoing clinical suspicion during an epidemic situation, or even in cases with high clinical suspicion during endemic situations. But repeat testing after an initial positive result generally is not recommended. Repeat testing to assess whether the patient is cured is definitely not recommended.
However, the decision to treat patients for C. difficile infection is a clinical one and may be justified even if all laboratory results are negative, Dr. Crobach and her associates noted.
FROM CLINICAL MICROBIOLOGY AND INFECTION
Key clinical point: The European Society of Clinical Microbiology and Infectious Diseases updated its guideline for diagnosing C. difficile infection.
Major finding: No single diagnostic assay should be used to diagnose C. difficile infection; diagnosis should be based on clinical signs and symptoms together with a two-step algorithm starting with either a nucleic acid amplification test or a glutamate dehydrogenase enzyme immunoassay.
Data source: A meta-analysis of 56 studies comparing the efficacy of 24 laboratory assays for diagnosing C. difficile, and a compilation of testing guidelines.
Disclosures: The meta-analysis was supported by the European Society of Clinical Microbiology and Infectious Diseases. Dr. Crobach and her associates reported having no relevant financial disclosures.
Ixekizumab improved psoriatic arthritis in patients who had not taken biologics
Two different doses of the humanized monoclonal antibody ixekizumab improved signs and symptoms of active psoriatic arthritis in a phase III manufacturer-sponsored trial of patients who had not taken a biologic drug before.
The agent selectively binds and neutralizes interleukin (IL)-17A, which promotes joint inflammation and damage via several mechanisms. The study findings thus support the view that IL-17A is a key cytokine in the pathogenesis of psoriatic arthritis and an appropriate therapeutic target, said Philip J. Mease, MD, of the department of rheumatology at Swedish Medical Center and the University of Washington, Seattle, and his associates.
The investigators are conducting an ongoing, 3-year, randomized, double-blind trial (SPIRIT-P1) comparing responses with an 80-mg dose of ixekizumab every 2 weeks (103 patients), an 80-mg dose every 4 weeks (107 patients), a 40-mg dose of adalimumab (Humira) every 2 weeks (101 patients, active control group), and matching placebo (106 patients, placebo-control group). Each of the two ixekizumab arms received a starting dose of 160 mg given as two injections at week 0. This report presented the findings after the initial 24-week, double-blind treatment period of the trial.
The study participants are adults with active psoriatic arthritis who had never been treated with biologic agents and who continued taking their usual doses of conventional disease-modifying antirheumatic drugs, oral corticosteroids, opiates, and/or nonsteroidal anti-inflammatory drugs/Cox-2 inhibitors during the study. The mean patient age was 49.5 years. Of the 382 who completed this portion of the study, 57 showed an inadequate response and required rescue medication, including 10 on the lower dose of ixekizumab, 11 on the higher dose of ixekizumab, 9 taking adalimumab, and 27 taking placebo.
The primary efficacy endpoint, ACR20 response at week 24, was met by 62.1% of the higher-dose ixekizumab group, 57.9% of the lower-dose ixekizumab group, and 57.4% of the adalimumab group, all of which were significantly greater than the 30.2% rate in the placebo group. Both doses of the study drug as well as the active control drug also improved secondary endpoints: reducing mean levels of disease activity as measured by the 28-joint Disease Activity Score using C-reactive protein, improving patient-reported physical function on the Health Assessment Questionnaire–Disability Index, and improving disease-related physical health as measured by the SF-36, the investigators said (Ann Rheum Dis. 2016 Aug 23. doi: 10.1136/annrheumdis-2016-209709).
In addition, the progression of structural joint damage, as assessed on radiographs of bone erosions and joint-space narrowing in the hands and feet, was significantly less with the three active treatments than with placebo. Among patients with the most extensive disease, a significantly greater percentage achieved Psoriasis Area and Severity Index 75 level of improvement with the three active treatments than with placebo. And among patients with nail involvement, mean improvements in Nail Psoriasis Severity Index scores were significantly higher with the three active treatments than with placebo.
Adverse effects included grade 1 and 2 neutropenia, herpes zoster involving the eyelid, gastroenteritis, esophageal candidiasis, and depression-related symptoms. All infections resolved with treatment, and none required discontinuation of the study drug.
This study was funded and sponsored by Eli Lilly, maker of ixekizumab. Dr. Mease reported receiving grants, personal fees, and other support from Eli Lilly, AbbVie, Amgen, Bristol Myers Squibb, Celgene, Crescendo, Genentech, Janssen, Pfizer, UCB Pharma, Merck, Novartis, and Corrona. His associates reported ties to numerous industry sources.
FROM ANNALS OF THE RHEUMATIC DISEASES
Key clinical point: Ixekizumab improved signs and symptoms of active psoriatic arthritis in a phase III manufacturer-sponsored trial of patients who had not taken biologics before.
Major finding: The primary endpoint, ACR20 response at week 24, was met by 62.1% of the higher-dose ixekizumab group, 57.9% of the lower-dose ixekizumab group, and 57.4% of the adalimumab group, all of which were significantly greater than the 30.2% rate in the placebo group.
Data source: A randomized, double-blind, placebo- and active treatment-controlled clinical trial involving 417 adults naive to biologic therapy.
Disclosures: This study was funded and sponsored by Eli Lilly, maker of ixekizumab. Dr. Mease reported receiving grants, personal fees, and other support from Eli Lilly, AbbVie, Amgen, Bristol Myers Squibb, Celgene, Crescendo, Genentech, Janssen, Pfizer, UCB Pharma, Merck, Novartis, and Corrona. His associates reported ties to numerous industry sources.
Host RNA biosignatures distinguish bacterial from viral fever
RNA-expression biosignatures derived from the patient’s peripheral blood distinguish bacterial from viral causes of fever in young children, according to two separate preliminary studies published online Aug. 23 in JAMA.
Several studies have suggested that the source of infection in febrile children might be identified by examining the pattern of host genes that are either activated or suppressed during the body’s inflammatory response. Distinguishing the relatively few but potentially life-threatening bacterial infections from the more common but milder, self-resolving viral infections is difficult, and current practice is to admit “ill-appearing” febrile children to the hospital and administer parenteral antibiotics while awaiting the results of blood and tissue cultures. Those results are often ambiguous, and the whole process places a large burden on health care resources and contributes to inappropriate antibiotic treatment.
Two multinational research groups developed different techniques for detecting RNA biosignatures in patients’ blood samples, then assessed the accuracy of those tests in validation cohorts. One group focused on ruling out bacterial infection as the source of fever in young children (median age, 19 months), while the other investigated whether the host responses of the youngest children (aged 60 days and younger), who have immature immune systems, are robust enough to allow detection of RNA biosignatures.
In the discovery phase of the first study, analysis of RNA gene expression was performed on blood samples obtained from 240 children at admission to hospitals in the United Kingdom, Spain, and the United States during a 4-year period. A total of 8,565 RNA transcript signatures were identified as potential biomarkers to discriminate between viral and bacterial infection. This was narrowed down to 38 transcript signatures, and then to only 2 – IFI44L and FAM89A – that were used to devise a Disease Risk Score (DRS) for each patient, said Jethro A. Herberg, PhD, of the division of infectious diseases, Imperial College London, and his associates.
IFI44L expression was increased in patients who had viral infection, while FAM89A expression was increased in those who had bacterial infection, as compared with healthy children. (In previous studies, IFI44L was reported to be up-regulated in interferon-mediated antiviral responses and FAM89A was reported to be elevated among children with septic shock.)
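The report summarized here does not give the DRS formula, but a two-transcript score of this kind is typically built from a weighted difference of normalized expression levels. The sketch below is purely illustrative and is not the published DRS: the weights, the classification threshold, and the sign convention (positive leaning bacterial) are all assumptions.

```python
def disease_risk_score(ifi44l, fam89a, w_viral=1.0, w_bact=1.0):
    """Illustrative two-transcript risk score (not the published DRS).

    Higher IFI44L expression was associated with viral infection and
    higher FAM89A with bacterial infection, so a simple score can be
    formed from their weighted (e.g., log-scale) difference.
    """
    return w_bact * fam89a - w_viral * ifi44l  # positive leans bacterial


def classify(score, threshold=0.0):
    """Dichotomize the score; the threshold here is an assumption."""
    return "bacterial" if score > threshold else "viral"


# Example: FAM89A strongly elevated relative to IFI44L leans bacterial.
print(classify(disease_risk_score(ifi44l=1.0, fam89a=4.0)))
```

In practice the weights and threshold would be fit on the discovery cohort and then evaluated, as the investigators did, on independent validation cohorts.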
The DRS showed 90% sensitivity in distinguishing viral from bacterial infection in the discovery cohort. It then showed 96.4% sensitivity in a validation cohort of 130 febrile children (mean age, 17 months). The DRS also identified bacterial infection in a validation cohort of 24 children with meningococcal infection (91.7% sensitivity and 96.0% specificity), and distinguished it from inflammatory conditions in another cohort of 30 children with juvenile idiopathic arthritis and 18 with Henoch-Schönlein purpura (90.0% sensitivity and 95.8% specificity).
The DRS discriminated among viral, bacterial, and inflammatory diseases including systemic lupus erythematosus in a further validation cohort, a published dataset from children and adults who had all three types of illness. It was accurate regardless of the severity of infection and regardless of the duration of infection, as well as in cases where patients were coinfected with both virus and bacteria, the investigators said (JAMA. 2016 Aug 23. doi:10.1001/jama.2016.11236).
“The DRS signature, distinguishing viral from bacterial infections with only two transcripts, has potential to be translated into a clinically applicable test using current technology. Furthermore, new methods for rapid detection of nucleic acids, including nanoparticles and electrical impedance, have potential for low-cost, rapid analysis of multitranscript signatures,” Dr. Herberg and his associates noted.
Further research is needed to assess the accuracy and clinical utility of this technique in different settings, they added.
In the second study, RNA gene expression was analyzed from blood samples from 1,883 febrile infants (median age, 37 days) “who posed diagnostic quandaries” at admission to 22 emergency departments during a 2-year period. The discovery phase involved 89 of these infants who ultimately were found to have bacterial infections (bacteremia or UTIs), 190 who didn’t have bacterial infections (enterovirus, influenza, or other viruses), and 19 healthy control infants, said Prashant Mahajan, MD, division chief and research director, pediatric emergency medicine, Children’s Hospital of Michigan, Detroit, and his associates.
The investigators identified 3,753 RNA transcript signatures that could potentially identify or rule out bacterial sources of infection, which they then narrowed down to 66. This set of 66 signatures showed 82% sensitivity and 88% specificity in the discovery cohort and 87% sensitivity and 89% specificity in a validation cohort.
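For reference, the sensitivity and specificity figures reported for such a biosignature come from a standard confusion-matrix calculation. A minimal sketch, using hypothetical counts chosen only to reproduce the validation-cohort figures of roughly 87% sensitivity and 89% specificity:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)


# Hypothetical counts, not the study's actual confusion matrix.
sens, spec = sensitivity_specificity(tp=87, fn=13, tn=89, fp=11)
print(round(sens, 2), round(spec, 2))  # 0.87 0.89
```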
“The bacterial RNA biosignature was notably more predictive of bacterial infection than clinical examination” and use of the Yale Observation Score, and it “added significantly to prediction beyond the YOS alone,” Dr. Mahajan and his associates said (JAMA. 2016 Aug 23. doi: 10.1001/jama.2016.9207).
“Despite the young age of the febrile infants evaluated, they carried robust RNA biosignatures and demonstrated that regardless of the etiology of the infections, their immune systems are programmed to respond not only with shared elements induced by common microbes but also with specific patterns that allow discrimination by class of pathogen,” they noted.
Further research is needed to confirm and refine these preliminary results. “As technology advances, RNA biosignatures may prove to be an alternative and accurate method to identify infants with bacterial infections. This would help clinicians target evaluation and therapy when they are needed and avoid invasive procedures, antibiotics, and hospitalizations when they are not,” Dr. Mahajan and his associates said.
Dr. Herberg’s study was supported by the Imperial College Comprehensive Biomedical Research Center, the National Institutes of Health, the European Union’s Seventh Framework Program, and numerous other groups. Dr. Herberg reported having no relevant financial disclosures. Dr. Mahajan’s study was supported by the Health Resources and Services Administration, Emergency Services for Children, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, and the National Institutes of Health. Dr. Mahajan reported having no relevant financial disclosures.
The work by Herberg et al. and Mahajan et al. represents an important advance: the potential of genetics to help in the evaluation of febrile children.
Clearly RNA sequencing and other methods for RNA quantification are in their early days, and clinical applications must await further replication and refinement of these results in rigorous studies. But the day may soon come when a parent of a febrile child can do a laboratory test at home, call a physician, and mutually decide whether the child should be seen for further evaluation.
Howard Bauchner, MD, is JAMA Editor in Chief. He reported having no relevant financial disclosures. Dr. Bauchner made these remarks in an editorial accompanying the two reports on RNA biosignatures (JAMA 2016;316:824-5).
RNA-expression biosignatures derived from the patient’s peripheral blood distinguish bacterial from viral causes of fever in young children, according to two separate preliminary studies published online Aug. 23 in JAMA.
Several studies have suggested that the source of infection in febrile children might be identified by examining the pattern of host genes that are either activated or suppressed during the body’s inflammatory response. Distinguishing the relatively few but potentially life-threatening bacterial infections from the more common but milder, self-resolving viral infections is difficult, and current practice is to admit “ill-appearing” febrile children to the hospital and administer parenteral antibiotics while awaiting the results of blood and tissue cultures. Those results are often ambiguous, and the whole process represents a large burden on health care resources as well as contributing to inappropriate antibiotic treatment.
Two multinational research groups developed different techniques for detecting RNA biosignatures in patients’ blood samples, then assessed the accuracy of those tests in validation cohorts. One group focused on ruling out bacterial infection as the source of fever in young children (median age, 19 months), while the other investigated whether the host responses of the youngest children (aged 60 days and younger), who have immature immune systems, are robust enough to allow detection of RNA biosignatures.
In the discovery phase of the first study, analysis of RNA gene expression was performed on blood samples obtained from 240 children at admission to hospitals in the United Kingdom, Spain, and the United States during a 4-year period. A total of 8,565 RNA transcript signatures were identified as potential biomarkers to discriminate between viral and bacterial infection. This was narrowed down to 38 transcript signatures, and then to only 2 – IFI44L and FAM89A – that were used to devise a Disease Risk Score (DRS) for each patient, said Jethro A. Herberg, PhD, of the division of infectious diseases, Imperial College London, and his associates.
IFI44L expression was increased in patients who had viral infection, while FAM89A expression was increased in those who had bacterial infection, as compared with healthy children. (In previous studies, IFI44L was reported to be up-regulated in interferon-mediated antiviral responses and FAM89A was reported to be elevated among children with septic shock.)
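The two-transcript logic can be sketched in code. This is a hypothetical illustration only: it assumes the Disease Risk Score is a simple difference between the two expression values, with higher IFI44L pushing toward a viral call and higher FAM89A toward a bacterial call; the actual normalization, weighting, and threshold are defined in the published study.

```python
# Hypothetical sketch of a two-transcript Disease Risk Score (DRS).
# Assumes (not from the source) that the score is the difference of the
# two normalized expression values; the paper defines the real weighting.

def disease_risk_score(ifi44l: float, fam89a: float) -> float:
    """Higher scores lean viral (IFI44L up); lower scores lean bacterial (FAM89A up)."""
    return ifi44l - fam89a

def classify(ifi44l: float, fam89a: float, threshold: float = 0.0) -> str:
    """Dichotomize the score at an assumed threshold of zero."""
    return "viral" if disease_risk_score(ifi44l, fam89a) > threshold else "bacterial"
```

The appeal of a two-transcript score, as the authors note, is that so small a signature could plausibly be run on simple, low-cost assay hardware rather than full RNA sequencing.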
The DRS showed 90% sensitivity in distinguishing viral from bacterial infection in the discovery cohort. It then showed 96.4% sensitivity in a validation cohort of 130 febrile children (mean age, 17 months). The DRS also identified bacterial infection in a validation cohort of 24 children with meningococcal infection (91.7% sensitivity and 96.0% specificity), and distinguished it from inflammatory conditions in another cohort of 30 children with juvenile idiopathic arthritis and 18 with Henoch-Schönlein purpura (90.0% sensitivity and 95.8% specificity).
The DRS discriminated among viral, bacterial, and inflammatory diseases including systemic lupus erythematosus in a further validation cohort, a published dataset from children and adults who had all three types of illness. It was accurate regardless of the severity of infection and regardless of the duration of infection, as well as in cases where patients were coinfected with both virus and bacteria, the investigators said (JAMA. 2016 Aug 23. doi:10.1001/jama.2016.11236).
“The DRS signature, distinguishing viral from bacterial infections with only two transcripts, has potential to be translated into a clinically applicable test using current technology. Furthermore, new methods for rapid detection of nucleic acids, including nanoparticles and electrical impedance, have potential for low-cost, rapid analysis of multitranscript signatures,” Dr. Herberg and his associates noted.
Further research is needed to assess the accuracy and clinical utility of this technique in different settings, they added.
In the second study, RNA gene expression was analyzed from blood samples from 1,883 febrile infants (median age, 37 days) “who posed diagnostic quandaries” at admission to 22 emergency departments during a 2-year period. The discovery phase involved 89 of these infants who ultimately were found to have bacterial infections (bacteremia or urinary tract infections), 190 who didn’t have bacterial infections (enterovirus, influenza, or other viruses), and 19 healthy control infants, said Prashant Mahajan, MD, division chief and research director, pediatric emergency medicine, Children’s Hospital of Michigan, Detroit, and his associates.
The investigators identified 3,753 RNA transcript signatures that could potentially identify or rule out bacterial sources of infection, which they then narrowed down to 66. This set of 66 signatures showed 82% sensitivity and 88% specificity in the discovery cohort and 87% sensitivity and 89% specificity in a validation cohort.
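For readers unfamiliar with the metrics quoted throughout both studies, sensitivity and specificity are simple proportions computed from how a test's calls compare with the reference diagnosis. The sketch below shows the standard definitions; the counts in the test below are illustrative round numbers, not data from either study.

```python
# Standard definitions of the two accuracy metrics quoted in both studies.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of truly infected patients the signature correctly flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of uninfected patients the signature correctly clears."""
    return true_neg / (true_neg + false_pos)
```

In a rule-out context such as this one, sensitivity is the metric that matters most: a missed bacterial infection (false negative) is far costlier than an unnecessary workup (false positive).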
“The bacterial RNA biosignature was notably more predictive of bacterial infection than clinical examination” and use of the Yale Observation Score, and it “added significantly to prediction beyond the YOS alone,” Dr. Mahajan and his associates said (JAMA. 2016 Aug 23. doi: 10.1001/jama.2016.9207).
“Despite the young age of the febrile infants evaluated, they carried robust RNA biosignatures and demonstrated that regardless of the etiology of the infections, their immune systems are programed to respond not only with shared elements induced by common microbes but also with specific patterns that allow discrimination by class of pathogen,” they noted.
Further research is needed to confirm and refine these preliminary results. “As technology advances, RNA biosignatures may prove to be an alternative and accurate method to identify infants with bacterial infections. This would help clinicians target evaluation and therapy when they are needed and avoid invasive procedures, antibiotics, and hospitalizations when they are not,” Dr. Mahajan and his associates said.
Dr. Herberg’s study was supported by the Imperial College Comprehensive Biomedical Research Center, the National Institutes of Health, the European Union’s Seventh Framework Program, and numerous other groups. Dr. Herberg reported having no relevant financial disclosures. Dr. Mahajan’s study was supported by the Health Resources and Services Administration, Emergency Services for Children, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, and the National Institutes of Health. Dr. Mahajan reported having no relevant financial disclosures.
FROM JAMA
Key clinical point: RNA biosignatures derived from the patient’s peripheral blood distinguish bacterial from viral causes of fever in young children.
Major finding: Study 1: The DRS showed 96.4% sensitivity in a validation cohort of 130 febrile children. Study 2: A set of 66 RNA transcript signatures showed 82% sensitivity and 88% specificity in the discovery cohort and 87% sensitivity and 89% specificity in a validation cohort.
Data source: Two separate preliminary studies developing and validating tests of host responses to infection, involving 240 and 279 patients, respectively.
Disclosures: Dr. Herberg’s study was supported by the Imperial College Comprehensive Biomedical Research Center, the National Institutes of Health, the European Union’s Seventh Framework Program, and numerous other groups. Dr. Herberg reported having no relevant financial disclosures. Dr. Mahajan’s study was supported by the Health Resources and Services Administration, Emergency Services for Children, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, and the National Institutes of Health. Dr. Mahajan reported having no relevant financial disclosures.
Treatment may allow HSCT without radiation, chemotherapy
A new therapy combining an anti-c-Kit monoclonal antibody with a CD47 blocker allowed hematopoietic stem cell engraftment in immunocompetent mice without the need for toxic preconditioning using radiation or chemotherapy, according to a report published in Science Translational Medicine.
Until now, hematopoietic stem cell transplantation has required rigorous conditioning regimens to clear out the host’s bone marrow, which can cause lifelong complications. So the procedure has been reserved for patients whose life-threatening disorders justified such toxicity. “Safer and more targeted conditioning protocols could both improve the safety of transplantation and extend the existing clinical utility of this powerful form of cell therapy,” said Akanksha Chhabra, PhD, of the department of blood and marrow transplantation, Stanford (Calif.) University, and her associates.
They assessed the new combined treatment in a series of laboratory and mouse studies. The opsonizing anti-c-Kit monoclonal antibodies induced robust depletion of functional hematopoietic stem cells in immunocompetent mice, which allowed donor stem cells to engraft in these hosts. Adding the T-cell–depleting CD47 antagonists further facilitated immune ablation of host stem and progenitor cells. Combined, the two agents eliminated more than 99% of host hematopoietic stem cells in the bone marrow and enabled strong engraftment of the donor stem cells, while avoiding radiation- and chemotherapy-related adverse effects.
The main toxicities that occurred in treated mice were, as expected, reductions in hematologic parameters, especially red blood cell indices. This may be related to a factor in mouse physiology that is not present in humans. But if such toxicities do develop in human subjects, they can be mitigated by careful monitoring and occasional supportive transfusions, Dr. Chhabra and her associates said (Sci Transl Med. 2016;8:351ra105).
These two types of antibodies are already being investigated separately in early-phase clinical trials. If the combined treatment proves effective and safe in humans – a question that awaits further clinical studies – hematopoietic stem cell transplantation might be extended to nonmalignant conditions such as inherited immunodeficiency, inborn errors of metabolism, and hemoglobinopathies. It might also be adapted for use in solid-organ transplants, the researchers added.
This work was supported by the Virginia and D.K. Ludwig Fund for Cancer Research and several other nonprofit organizations, the California Institute for Regenerative Medicine, and the National Institutes of Health. Dr. Chhabra is a coinventor on a patent described in this article, and her associates are cofounders of Forty Seven, the company that licensed the technology for radiation- and chemotherapy-free stem-cell transplantation. Two associates also serve as advisors for Alexo Therapeutics, which develops CD47-based treatments.
FROM SCIENCE TRANSLATIONAL MEDICINE
Key clinical point: A new treatment allowed hematopoietic stem cell engraftment in immunocompetent mice without the need for toxic preconditioning using radiation or chemotherapy.
Major finding: The combined therapy eliminated more than 99% of host hematopoietic stem cells.
Data source: A series of laboratory and mouse studies of combined treatment with anti-c-Kit monoclonal antibodies plus CD47 blockers.
Disclosures: This work was supported by the Virginia and D.K. Ludwig Fund for Cancer Research and several other nonprofit organizations, the California Institute for Regenerative Medicine, and the National Institutes of Health. Dr. Chhabra is a coinventor on a patent described in this article, and her associates are cofounders of Forty Seven, the company that licensed the technology for radiation- and chemotherapy-free stem-cell transplantation. Two associates also serve as advisors for Alexo Therapeutics, which develops CD47-based treatments.
NAF1 gene mutations predispose to pulmonary fibrosis, emphysema
Rare frameshift mutations in the NAF1 gene were discovered to cause a telomere-shortening syndrome which, among other adverse effects, predisposes carriers to develop pulmonary fibrosis (PF) and emphysema, according to a report published in Science Translational Medicine.
“Our findings here ... highlight how telomere shortening is a relevant mechanism for PF-emphysema susceptibility in a subset of patients beyond those with mutations in the telomerase core components. It is thus possible that efforts to reverse the telomere defect, or other regenerative approaches, will influence the natural history of these progressive pathologies in patients with telomere-mediated lung disease,” said Susan E. Stanley, an MD-PhD candidate in the department of oncology, Johns Hopkins University, Baltimore, and her associates.
Pulmonary fibrosis and emphysema cluster in some families, but the genetic basis of such cases is poorly understood. Both PF and emphysema have been linked to premature aging of lung tissue and to abnormalities in the maintenance of telomere length. In addition, at least half of patients with familial and sporadic PF, and many with emphysema, have the clinical features of a short-telomere syndrome, including bone marrow failure/myelodysplastic syndrome, liver disease, and infertility.
The diagnosis of a short-telomere syndrome, as opposed to isolated PF-emphysema, is essential for appropriate treatment because if the defect is systemic, patients will “show exquisite sensitivity to otherwise tolerated medications and procedures, especially in the setting of lung transplantation,” the investigators said (Sci Transl Med. 2016;8:351ra107).
To explore the genetic basis of familial PF-emphysema, the researchers performed a series of studies, beginning with whole-genome sequencing on peripheral blood samples from five unrelated probands in familial PF-emphysema pedigrees. These participants had abnormally short telomeres and extrapulmonary features of short-telomere syndrome. Three of them who had low levels of the telomerase RNA component TR were selected for a candidate gene search, which revealed the NAF1 mutations.
The mutations were then found to be present in 2 of 30 (7%) affected members of a prevalence cohort, but in none of 134 unaffected control subjects and in none of 9,006 samples from a public database of unaffected people. Further genetic, laboratory, and mouse studies were performed to link the mutations with specific pathologies and to trace their functional effects. The results led the researchers to conclude that these rare NAF1 variants interfere with RNA biogenesis, causing short telomeres that result in lung disease and other abnormalities.
This work was supported by the National Institutes of Health, the Commonwealth Foundation, and the American Cancer Society. Ms. Stanley and her associates reported having no relevant financial disclosures.
FROM SCIENCE TRANSLATIONAL MEDICINE
Key clinical point: Certain rare mutations in the NAF1 gene were discovered to predispose carriers to develop pulmonary fibrosis and emphysema.
Major finding: The rare NAF1 mutations were detected in 2 of 30 (7%) family members in an affected pedigree but in 0 of 134 controls.
Data source: A series of genetic sequencing and other studies involving five affected probands, 30 unrelated but affected patients, and 134 control subjects.
Disclosures: This work was supported by the National Institutes of Health, the Commonwealth Foundation, and the American Cancer Society. Ms. Stanley and her associates reported having no relevant financial disclosures.
Flu vaccine prevented hospitalizations in patients 50 and older
The seasonal influenza vaccination reduced flu-related hospitalizations by 56.8% among people aged 50 and older during a recent flu season, according to a report published in Clinical Infectious Diseases.
Even in the oldest age group – the population with the highest risk of developing flu complications and perhaps the weakest immune response – influenza vaccination prevented serious complications, said Fiona P. Havers, MD, of the influenza division, Centers for Disease Control and Prevention, Atlanta, and her associates.
Data on vaccine efficacy in older adults are sparse, and randomized, placebo-controlled trials to gather evidence would be unethical. Dr. Havers and her colleagues studied the issue using a case-control design, focusing on community-dwelling adults aged 50 years and older during the 2010-2011 flu season. They identified 368 patients across 10 states who were hospitalized for polymerase chain reaction–confirmed influenza and matched them for age and county of residence with 773 control subjects.
Hospitalized case-patients were less likely to have been vaccinated (55%) than were control subjects (63%). Thus, the flu vaccine reduced the risk of hospitalization for influenza by 56.8% overall.
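In a case-control design, vaccine effectiveness is conventionally estimated as VE = (1 − odds ratio) × 100%, where the odds ratio compares vaccination odds in cases versus controls. A minimal sketch of that calculation using the crude vaccination proportions reported above (the function name is illustrative; the published 56.8% figure is adjusted for matching and confounders, so the crude estimate below is lower):

```python
# Crude vaccine-effectiveness estimate from a case-control study.
# Standard estimator: VE = (1 - OR) * 100, where OR is the odds ratio of
# vaccination among cases relative to controls.

def vaccine_effectiveness(p_cases: float, p_controls: float) -> float:
    """Return crude VE (%) from vaccination proportions in cases and controls."""
    odds_cases = p_cases / (1 - p_cases)
    odds_controls = p_controls / (1 - p_controls)
    return (1 - odds_cases / odds_controls) * 100

# 55% of hospitalized case-patients vs. 63% of controls were vaccinated.
ve = vaccine_effectiveness(0.55, 0.63)
print(f"crude VE: {ve:.1f}%")  # crude VE ~ 28.2%
```

The gap between this crude estimate and the reported 56.8% reflects the age and county matching plus covariate adjustment in the published model.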
Vaccination reduced hospitalization for influenza by 63.9% in the youngest age group (50-64 years), by 61.0% in the intermediate age group (65-74 years), and by 57.3% in the oldest age group (75 years and older).
These results are similar to those reported in other studies assessing the same time period, including one that evaluated vaccine efficacy in ambulatory adults in the United States and Europe. They also are consistent with the results of observational studies performed during different flu seasons, the investigators said (Clin Infect Dis. 2016 Aug 2. doi: 10.1093/cid/ciw512).
Compared with control subjects, case-patients were more likely to be of nonwhite race, to be of Hispanic ethnicity, to have a lower income, to have had fewer years of education, to have two or more chronic health conditions, to have required recent hospitalization for respiratory problems, to have impaired mobility, and to have lower functional status.
“These findings support current U.S. recommendations for annual influenza vaccination in older adults, especially in adults aged 65 and older who are at higher risk of influenza-associated complications,” Dr. Havers and her associates said.
The Centers for Disease Control and Prevention supported the study. Dr. Havers reported having no relevant financial disclosures; one of her associates reported ties to Genentech, Merck, Novavax, and Pfizer.
FROM CLINICAL INFECTIOUS DISEASES
Key clinical point: Seasonal influenza vaccination reduced flu-related hospitalizations by 56.8% in people aged 50 years and older.
Major finding: Vaccination reduced hospitalization for influenza by 63.9% in people aged 50-64 years, by 61.0% in those aged 65-74 years, and by 57.3% in those aged 75 years and older.
Data source: A retrospective case-control study involving 368 cases and 773 matched controls assessed during a single recent flu season.
Disclosures: The Centers for Disease Control and Prevention supported the study. Dr. Havers reported having no relevant financial disclosures; one of her associates reported ties to Genentech, Merck, Novavax, and Pfizer.
Pretransplantation mogamulizumab for ATLL raises risk of GVHD
The use of mogamulizumab before allogeneic hematopoietic stem-cell transplantation in aggressive adult T-cell leukemia/lymphoma is associated with an increased risk of acute graft-versus-host disease (GVHD), which leads to inferior overall survival, investigators report in the Journal of Clinical Oncology.
Mogamulizumab is an anti-CCR4 monoclonal antibody that showed promise in small clinical studies when added to conventional chemotherapy as first-line treatment. It was recently approved for the treatment of adult T-cell leukemia/lymphoma in Japan, and eventually may be approved in the U.S. and other countries, said Shigeo Fuji, MD, of the department of hematopoietic stem-cell transplantation, National Cancer Center Hospital, Tokyo, and his associates.
The agent significantly depleted regulatory T cells for several months in animal models. This prompted concern regarding the possibility of exacerbating GVHD in human patients who don’t respond completely to first-line chemotherapy and then undergo stem-cell transplantation. “However, no direct evidence has demonstrated [regulatory T-cell] depletion in humans,” the investigators noted.
To examine this issue, they assessed clinical outcomes in a cohort of 996 patients across Japan who had aggressive adult T-cell leukemia/lymphoma, were aged 20-69 years, were diagnosed in 2000-2013, and received intensive, multiagent chemotherapy before undergoing allogeneic hematopoietic stem-cell transplantation.
Grade 2-4 acute GVHD developed in 381 of 873 patients who didn’t receive mogamulizumab (43.6%), compared with 47 of 81 patients who did receive the agent (58.0%), for a relative risk of 1.33 (P = .01). Grade 3-4 acute GVHD developed in 150 patients who didn’t receive mogamulizumab (17.2%), compared with 25 who did (30.9%), for an RR of 1.80 (P less than .01).
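The relative risks quoted above follow directly from the raw counts in the report; a quick sketch recomputing them (the helper function is illustrative, not from the study):

```python
# Relative risk of acute GVHD with vs. without pretransplantation mogamulizumab,
# recomputed from the raw counts reported in the study.

def relative_risk(events_exposed: int, n_exposed: int,
                  events_unexposed: int, n_unexposed: int) -> float:
    """RR = incidence in the exposed group / incidence in the unexposed group."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Grade 2-4 acute GVHD: 47/81 with mogamulizumab vs. 381/873 without.
rr_grade24 = relative_risk(47, 81, 381, 873)
# Grade 3-4 acute GVHD: 25/81 with mogamulizumab vs. 150/873 without.
rr_grade34 = relative_risk(25, 81, 150, 873)
print(f"grade 2-4 RR: {rr_grade24:.2f}")  # 1.33
print(f"grade 3-4 RR: {rr_grade34:.2f}")  # 1.80
```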
The agent not only raised the rate of GVHD, it also increased the severity of the disorder. GVHD was refractory to systemic corticosteroids in 23.5% of patients who didn’t receive mogamulizumab, compared with 48.9% of those who did, for an RR of 2.09 (P less than .01), the investigators reported (J Clin Oncol. 2016. doi: 10.1200/JCO.2016.67.8250).
In addition, 1-year disease-free mortality was 25.1% without mogamulizumab, compared with 43.7% with it. The estimated 1-year overall survival was 49.4% without mogamulizumab, compared with 32.3% with it. And in multivariable analyses, receiving mogamulizumab before undergoing stem-cell transplantation was a significant risk factor for both disease-free mortality (hazard ratio, 1.93) and overall mortality (HR, 1.67).
“All hematologists should take the risks and benefits of mogamulizumab into consideration before they use [it] in transplantation-eligible patients,” Dr. Fuji and his associates said.
FROM THE JOURNAL OF CLINICAL ONCOLOGY
Key clinical point: The use of mogamulizumab before allogeneic hematopoietic stem-cell transplantation in aggressive adult T-cell leukemia/lymphoma was associated with an increased risk of acute GVHD and increased mortality.
Major finding: Grade 3-4 acute GVHD developed in 17.2% of patients who didn’t receive mogamulizumab, compared with 30.9% who did, for a relative risk of 1.80.
Data source: A retrospective cohort study involving 996 patients with adult T-cell leukemia/lymphoma in Japan.
Disclosures: This study was supported in part by Practical Research for Innovative Cancer Control and the Japan Agency for Medical Research and Development. Dr. Fuji and one associate reported receiving honoraria from Kyowa Hakko Kirin; another associate reported ties to numerous industry sources.
Evidence doesn’t support tight glycemic control
The scientific evidence does not support tight glycemic control as a means to prevent the complications of type 2 diabetes, even though most clinical practice guidelines, quality-of-care measures, quality improvement interventions, and academic and clinical statements unequivocally endorse tight glycemic control for that purpose, according to a report published online Aug. 23 in Circulation: Cardiovascular Quality and Outcomes.
There is an enormous disconnect between the widespread consensus that tight glycemic control is essential on the one hand, and the overwhelming data demonstrating that it doesn’t prevent 10 of the 11 micro- and macrovascular complications that matter most to patients on the other hand. “This consensus and its downstream consequences to practice, policy, and research” must be recalibrated, said Rene Rodriguez-Gutierrez, MD, and Victor M. Montori, MD, both of the Knowledge and Evaluation Research Unit, Division of Endocrinology, Mayo Clinic, Rochester, Minn.
They systematically reviewed the current evidence regarding tight glycemic control (achieving hemoglobin A1c under 7%) published in the five “most impactful” general medical journals (the New England Journal of Medicine, the Lancet, JAMA, the British Medical Journal, and Annals of Internal Medicine) and the two most impactful specialty journals (Diabetes Care and the Journal of the American College of Cardiology) between 2006 and 2015. This included 328 research articles, 16 sets of treatment guidelines, 11 meta-analyses, and five large, randomized clinical trials and their extension studies, as well as relevant letters, commentaries, and editorials. They also reviewed national guidelines and standards of care published in all languages during the study period.
The investigators focused on the effect of tight glycemic control, as opposed to looser control, on 11 outcomes most important to patients: end-stage renal disease or the need for dialysis, renal death, blindness, clinical neuropathy, microalbuminuria, retinal photocoagulation, all-cause mortality, cardiovascular mortality, nonfatal MI, fatal and nonfatal stroke, and peripheral vascular events or amputations.
Regarding the microvascular complications, good evidence shows that tight glycemic control has no significant impact on the risk of end-stage renal disease, renal death, blindness, or clinical neuropathy, and that there is no threshold HbA1c effect on risk. Moreover, the incidence of such complications is very low (less than 6%). Nevertheless, “practice guidelines and published statements offer a consistent and confident consensus, with 100% of the guidelines and 77%-100% of academic and clinical statements in favor of tight glycemic control to prevent microvascular complications,” according to Dr. Rodriguez-Gutierrez and Dr. Montori (Circ Cardiovasc Qual Outcomes. 2016 Aug 23;9:00-00. doi: 10.1161/CIRCOUTCOMES.116.002901).
Regarding the macrovascular complications, the evidence consistently shows that tight glycemic control exerts no significant effect on all-cause or cardiovascular mortality or on fatal or nonfatal stroke. The putative protective effect reported on amputations is “imprecise,” as it is based on very few such events. The only protective effect of tight glycemic control in this category of complications is that it reduces the risk of nonfatal MI by 15%.
Since the publication of the ACCORD trial, which clearly questioned the ability of tight glycemic control to prevent macrovascular complications, the consensus on this point has “withered.” At present, 64%-79% of published statements express “uncertainty and skepticism” that tight glycemic control is essential. Yet only two sets of guidelines – the American Diabetes Association standards published in 2003 and 2004 – expressed similar skepticism.
The study findings indicate that despite good evidence to the contrary, the unsupported “consensus” on tight glycemic control drives most guidelines and quality-of-care interventions. It also underlies “the Food and Drug Administration policy to approve diabetes mellitus drugs only on the basis of their antihyperglycemic effect, without requiring evidence of reduction in the risk of complications,” the investigators said.
“This consensus is also driving studies such as the National Institutes of Health–funded GRADE trial comparing antihyperglycemic drugs on their ability to reduce HbA1c, rather than to reduce the risk of diabetes complications,” they added.
The narrow focus on tight glycemic control has undercut research on other possible interventions to prevent these complications. There are zero trials currently under way assessing treatment possibilities other than drugs that reduce hyperglycemia, and there are zero evidence-based therapies either mentioned in guidelines or routinely prescribed to patients for preventing these complications, Dr. Rodriguez-Gutierrez and Dr. Montori wrote.
“A careful and thoughtful recalibration” is needed. “Today, patients with type 2 diabetes, at least in some parts of the world, seem to live longer lives with fewer complications. The evidence summarized here requires us to explore factors other than tight glycemic control to explain this improvement and better address the diabetes epidemic,” they noted.
This study was supported by the National Center for Advancing Translational Sciences, a component of the National Institutes of Health. Dr. Rodriguez-Gutierrez and Dr. Montori reported having no relevant financial disclosures.
The scientific evidence does not support tight glycemic control as a means to prevent the complications of type 2 diabetes, even though most clinical practice guidelines, quality-of-care measures, quality improvement interventions, and academic and clinical statements unequivocally endorse tight glycemic control for that purpose, according to a report published online Aug. 23 in Circulation: Cardiovascular Quality and Outcomes.
There is an enormous disconnect between the widespread consensus that tight glycemic control is essential on the one hand, and the overwhelming data demonstrating that it doesn’t prevent 10 of the 11 micro- and macrovascular complications that matter most to patients on the other hand. “This consensus and its downstream consequences to practice, policy, and research” must be recalibrated, said Rene Rodriguez-Gutierrez, MD, and Victor M. Montori, MD, both of the Knowledge and Evaluation Research Unit, Division of Endocrinology, Mayo Clinic, Rochester, Minn.
They systematically reviewed the current evidence regarding tight glycemic control (achieving hemoglobin A1c under 7%) published in the five “most impactful” general medical journals (the New England Journal of Medicine, the Lancet, JAMA, the British Medical Journal, and Annals of Internal Medicine) and the two most impactful specialty journals (Diabetes Care and the Journal of the American College of Cardiology) between 2006 and 2015. This included 328 research articles, 16 sets of treatment guidelines, 11 meta-analyses, and five large, randomized clinical trials and their extension studies, as well as relevant letters, commentaries, and editorials. They also reviewed national guidelines and standards of care published in all languages during the study period.
The investigators focused on the effect of tight glycemic control, as opposed to looser control, on 11 outcomes most important to patients: end-stage renal disease or the need for dialysis, renal death, blindness, clinical neuropathy, microalbuminuria, retinal photocoagulation, all-cause mortality, cardiovascular mortality, nonfatal MI, fatal and nonfatal stroke, and peripheral vascular events or amputations.
Regarding the microvascular complications, good evidence shows that tight glycemic control has no significant impact on the risk of end-stage renal disease, renal death, blindness, or clinical neuropathy, and that there is no threshold HbA1c effect on risk. Moreover, the incidence of such complications is very low (less than 6%). Nevertheless, “practice guidelines and published statements offer a consistent and confident consensus, with 100% of the guidelines and 77%-100% of academic and clinical statements in favor of tight glycemic control to prevent microvascular complications,” according to Dr. Rodriguez-Gutierrez and Dr. Montori (Circ Cardiovasc Qual Outcomes. 2016 Aug 23;9:00-00. doi: 10.1161/CIRCOUTCOMES.116.002901).
Regarding the macrovascular complications, the evidence consistently shows that tight glycemic control exerts no significant effect on all-cause or cardiovascular mortality or on fatal or nonfatal stroke. The putative protective effect reported on amputations is “imprecise,” as it is based on very few such events. The only protective effect of tight glycemic control in this category of complications is that it reduces the risk of nonfatal MI by 15%.
Since the publication of the ACCORD trial, which clearly questioned the ability of tight glycemic control to prevent macrovascular complications, the consensus on this point has “withered.” At present, 64%-79% of published statements now express “uncertainty and skepticism” that tight glycemic control is essential. Yet two sets of guidelines – the American Diabetes Association standards published in 2003 and 2004 – did so.
The study findings indicate that despite good evidence to the contrary, the unsupported “consensus” on tight glycemic control drives most guidelines and quality-of-care interventions. It also underlies “the Food and Drug Administration policy to approve diabetes mellitus drugs only on the basis of their antihyperglycemic effect, without requiring evidence of reduction in the risk of complications,” the investigators said.
“This consensus is also driving studies such as the National Institutes of Health–funded GRADE trial comparing antihyperglycemic drugs on their ability to reduce HbA1c, rather than to reduce the risk of diabetes complications,” they added.
The narrow focus on tight glycemic control has undercut research on other possible interventions to prevent these complications. There are zero trials currently under way assessing treatment possibilities other than drugs that reduce hyperglycemia, and there are zero evidence-based therapies either mentioned in guidelines or routinely prescribed to patients for preventing these complications, Dr. Rodriguez-Gutierrez and Dr. Montori wrote.
“A careful and thoughtful recalibration” is needed. “Today, patients with type 2 diabetes, at least in some parts of the world, seem to live longer lives with fewer complications. The evidence summarized here requires us to explore factors other than tight glycemic control to explain this improvement and better address the diabetes epidemic,” they noted.
This study was supported by the National Center for Advancing Translational Sciences, a component of the National Institutes of Health. Dr. Rodriguez-Gutierrez and Dr. Montori reported having no relevant financial disclosures.
The scientific evidence does not support tight glycemic control as a means to prevent the complications of type 2 diabetes, even though most clinical practice guidelines, quality-of-care measures, quality improvement interventions, and academic and clinical statements unequivocally endorse tight glycemic control for that purpose, according to a report published online Aug. 23 in Circulation: Cardiovascular Quality and Outcomes.
There is an enormous disconnect between the widespread consensus that tight glycemic control is essential on the one hand, and the overwhelming data demonstrating that it doesn’t prevent 10 of the 11 micro- and macrovascular complications that matter most to patients on the other hand. “This consensus and its downstream consequences to practice, policy, and research” must be recalibrated, said Rene Rodriguez-Gutierrez, MD, and Victor M. Montori, MD, both of the Knowledge and Evaluation Research Unit, Division of Endocrinology, Mayo Clinic, Rochester, Minn.
They systematically reviewed the current evidence regarding tight glycemic control (achieving hemoglobin A1c under 7%) published in the five “most impactful” general medical journals (the New England Journal of Medicine, the Lancet, JAMA, the British Medical Journal, and Annals of Internal Medicine) and the two most impactful specialty journals (Diabetes Care and the Journal of the American College of Cardiology) between 2006 and 2015. This included 328 research articles, 16 sets of treatment guidelines, 11 meta-analyses, and five large, randomized clinical trials and their extension studies, as well as relevant letters, commentaries, and editorials. They also reviewed national guidelines and standards of care published in all languages during the study period.
The investigators focused on the effect of tight glycemic control, as opposed to looser control, on 11 outcomes most important to patients: end-stage renal disease or the need for dialysis, renal death, blindness, clinical neuropathy, microalbuminuria, retinal photocoagulation, all-cause mortality, cardiovascular mortality, nonfatal MI, fatal and nonfatal stroke, and peripheral vascular events or amputations.
Regarding the microvascular complications, good evidence shows that tight glycemic control has no significant impact on the risk of end-stage renal disease, renal death, blindness, or clinical neuropathy, and that there is no threshold HbA1c effect on risk. Moreover, the incidence of such complications is very low (less than 6%). Nevertheless, “practice guidelines and published statements offer a consistent and confident consensus, with 100% of the guidelines and 77%-100% of academic and clinical statements in favor of tight glycemic control to prevent microvascular complications,” according to Dr. Rodriguez-Gutierrez and Dr. Montori (Circ Cardiovasc Qual Outcomes. 2016 Aug 23;9:00-00. doi: 10.1161/CIRCOUTCOMES.116.002901).
Regarding the macrovascular complications, the evidence consistently shows that tight glycemic control exerts no significant effect on all-cause or cardiovascular mortality or on fatal or nonfatal stroke. The putative protective effect reported on amputations is “imprecise,” as it is based on very few such events. The only protective effect of tight glycemic control in this category of complications is that it reduces the risk of nonfatal MI by 15%.
Since the publication of the ACCORD trial, which clearly questioned the ability of tight glycemic control to prevent macrovascular complications, the consensus on this point has “withered.” At present, 64%-79% of published statements express “uncertainty and skepticism” that tight glycemic control is essential. Yet among practice guidelines, only two sets – the American Diabetes Association standards published in 2003 and 2004 – have expressed this uncertainty.
The study findings indicate that despite good evidence to the contrary, the unsupported “consensus” on tight glycemic control drives most guidelines and quality-of-care interventions. It also underlies “the Food and Drug Administration policy to approve diabetes mellitus drugs only on the basis of their antihyperglycemic effect, without requiring evidence of reduction in the risk of complications,” the investigators said.
“This consensus is also driving studies such as the National Institutes of Health–funded GRADE trial comparing antihyperglycemic drugs on their ability to reduce HbA1c, rather than to reduce the risk of diabetes complications,” they added.
The narrow focus on tight glycemic control has undercut research on other possible interventions to prevent these complications. There are zero trials currently under way assessing treatment possibilities other than drugs that reduce hyperglycemia, and there are zero evidence-based therapies either mentioned in guidelines or routinely prescribed to patients for preventing these complications, Dr. Rodriguez-Gutierrez and Dr. Montori wrote.
“A careful and thoughtful recalibration” is needed. “Today, patients with type 2 diabetes, at least in some parts of the world, seem to live longer lives with fewer complications. The evidence summarized here requires us to explore factors other than tight glycemic control to explain this improvement and better address the diabetes epidemic,” they noted.
This study was supported by the National Center for Advancing Translational Sciences, a component of the National Institutes of Health. Dr. Rodriguez-Gutierrez and Dr. Montori reported having no relevant financial disclosures.
FROM CIRCULATION: CARDIOVASCULAR QUALITY AND OUTCOMES
Key clinical point: The scientific evidence doesn’t support tight glycemic control to prevent the complications of type 2 diabetes.
Major finding: All current practice guidelines and the vast majority of published academic and clinical statements endorse tight glycemic control to prevent microvascular complications.
Data source: A systematic review of 328 research articles, 16 treatment guidelines, 11 meta-analyses, and five randomized controlled trials, as well as reviews, letters, commentaries, editorials, and standards of care published during 2006-2015.
Disclosures: This study was supported by the National Center for Advancing Translational Sciences, a component of the National Institutes of Health. Dr. Rodriguez-Gutierrez and Dr. Montori reported having no relevant financial disclosures.
Power morcellation dropped, abdominal hysterectomy increased after FDA warning
Electric power morcellation during hysterectomy declined sharply after the Food and Drug Administration discouraged use of the technique in April 2014 and then recommended against it for perimenopausal and postmenopausal women in November 2014. At the same time, use of abdominal hysterectomy increased, according to a new analysis.
The FDA took these actions because of concern that intraoperative morcellation could inadvertently expose healthy abdominal tissue to contamination from occult uterine malignancies. But some clinicians warned that avoiding morcellation would lead to a greater number of hysterectomies via laparotomy, with an attendant increase in surgical complications.
To assess the effect of the FDA recommendations, Jason D. Wright, MD, of the division of gynecologic oncology, Columbia University, New York, and his colleagues analyzed time trends in hysterectomy and morcellation during nine 3-month periods before and after the FDA announcements. They used information from a national database that covers more than 500 hospitals across the country, including in their analysis 203,520 women aged 18-95 years (mean age, 48 years) who underwent hysterectomy during the study period.
Among the 117,653 minimally invasive hysterectomies performed, the use of electric power morcellation rose slightly during 2013, peaking at 13.7% in the fourth quarter of that year. It then declined precipitously, to a low of 2.8% by the last 3-month period assessed, which was the first quarter of 2015. Simultaneously, the use of abdominal hysterectomy increased from 27.1% of procedures in early 2013 to 31.8% by the last 3-month period assessed, the researchers reported (JAMA 2016;316:877-8).
However, despite the increase in abdominal procedures, the overall complication rate did not change over time: it was 8.3% during the first quarter studied and 8.4% during the last. In fact, the rate of complications during abdominal hysterectomy declined, from 18.4% to 17.6%. Similarly, the complication rate dropped from 4.4% to 4.1% during minimally invasive hysterectomy and from 4.7% to 4.2% during vaginal hysterectomy.
The researchers noted that the prevalence of uterine cancer, endometrial hyperplasia, other gynecologic cancers, and uterine tumors of indeterminate behavior were unchanged during the study period among women who underwent minimally invasive hysterectomy with power morcellation.
“The FDA warnings might result in a lower prevalence of cancer among women who underwent morcellation due to greater scrutiny on patient selection. However, the high rate of abnormal pathology after the warnings highlights the difficulty in the preoperative detection of uterine pathology,” the researchers wrote. “Continued caution is needed to limit the inadvertent morcellation of uterine pathology.”
The National Cancer Institute funded the study. The researchers reported having no relevant financial disclosures.
I read with interest “Trends in Use and Outcomes of Women Undergoing Hysterectomy With Electric Power Morcellation,” by Jason D. Wright and colleagues, published in JAMA. As expected, with the concerns raised by the FDA regarding electric power morcellation, there has been a statistically significant reduction in a laparoscopic approach to hysterectomy (59.2% to 56.2%) and an even more marked decrease in use of electric power morcellation (13.7% to 2.8%).
Interestingly, the increase in abdominal hysterectomy (27.1% to 31.8%) appears to be secondary not only to the reduction in minimally invasive gynecologic surgery but to the decline in vaginal hysterectomy as well. While the decrease in vaginal hysterectomy is likely due in part to the general trend away from this technique, it is probably also due to concern about cutting up a potential sarcoma during the procedure to deliver the large fibroid uterus. This is supported by the fact that the greatest percentage drop in vaginal hysterectomy occurred in Q1-Q2 2014, when the FDA issued its safety concern.
In a study by Matthew Siedhoff, MD, et al. (Am J Obstet Gynecol. 2015 May;212[5]:591.e1-8) utilizing a decision tree model, Dr. Siedhoff anticipated that severe complications would increase with conversion of a minimally invasive approach to laparotomy. While Wright et al. noted no increase in complications per se, they gave no indication of complication severity. Thus, while overall complications have not increased, severe complications may, in fact, have increased as anticipated by Siedhoff et al. Furthermore, impact on quality of life (e.g., hospitalization, convalescence at home) was not considered in this study, but it is a well-known difference between minimally invasive surgery and open surgery.
Charles E. Miller, MD, is a clinical associate professor at the University of Illinois, and past president of the AAGL. He is a reproductive endocrinologist and minimally invasive gynecologic surgeon in private practice in Naperville and Schaumburg, Ill. Dr. Miller reported that he is working on a study with Espiner Medical Ltd. to evaluate the safety and efficacy of a bag that is utilized for contained electric power morcellation. Karl Storz is sponsoring the study.
FROM JAMA
Key clinical point: Electric power morcellation declined after the FDA recommended against using the technique during hysterectomy.
Major finding: Use of electric power morcellation peaked at 13.7% before the FDA recommendations, then declined to a low of 2.8%.
Data source: A retrospective database analysis involving 203,520 hysterectomies performed at more than 500 U.S. hospitals during 2013-2015.
Disclosures: The National Cancer Institute funded the study. The researchers reported having no relevant financial disclosures.
Breast density is key to appropriate screening intervals
Breast density is an important factor in determining the appropriate screening intervals for mammography after age 50 years, according to a report published online Aug. 22 in Annals of Internal Medicine.
Researchers from the Cancer Intervention and Surveillance Modeling Network, collaborating with the Breast Cancer Surveillance Consortium, assessed three separate, well-established microsimulation models that used different structures and underlying assumptions but the same data input to estimate the benefits and harms of various screening intervals. They applied the models to two hypothetical populations: women aged 50 years and older who were initiating screening for the first time, and women aged 65 years who had undergone biennial screening since age 50 years.
The models incorporated national data regarding breast cancer incidence, treatment efficacy, and survival. They assessed patient risk by including numerous factors, such as menopausal status, obesity status, age at menarche, nulliparity, and previous biopsy results, but didn’t include family history or genetic testing results. Screening strategies were compared among four possible breast-density levels, according to the American College of Radiology’s Breast Imaging Reporting and Data System (BI-RADS).
The principal finding was that two factors – breast density and risk for breast cancer – were key to determining the optimal screening interval. The optimal interval was the one that would yield the highest number of benefits (breast cancer deaths averted, life-years gained, and quality-adjusted life-years gained) while yielding the lowest number of harms (false-positive mammograms, benign biopsies, and overdiagnosis).
“For average-risk women in low-density subgroups, who comprise a large portion of the population, triennial screening provides a reasonable balance of benefits and harms and is cost effective. Annual screening has a favorable balance of benefits and harms and would be considered cost effective for subgroups of women ... with risk levels that are two to four times the average and with heterogeneously or extremely dense breasts,” the researchers wrote (Ann Intern Med. 2016 Aug 22. doi: 10.7326/M16-0476).
After age 50 years, annual mammography was more beneficial than harmful only in women with both greater breast density and higher risk for breast cancer. Such women are estimated to comprise less than 1% of the general population at both age 50 years and age 65 years. In contrast, biennial and even triennial mammography yielded fewer false-positives and fewer biopsies for average-risk women with low-density breasts without affecting the number of breast cancer deaths averted, the researchers noted.
The study was supported by grants from the National Institutes of Health and several state public health departments and cancer registries in the United States. The researchers reported receiving grants and other support from the NIH, the American Society of Breast Surgeons, Renaissance Rx, Ally Clinical Diagnostics, the Netherlands National Institute for Public Health and the Environment, SCOR Global Risk Center, and Genomic Health Canada.
The U.S. Preventive Services Task Force made a grade B recommendation for biennial mammography screening in average-risk women aged 50 to 74 years. This current work from the well-regarded Cancer Intervention and Surveillance Modeling Network and Breast Cancer Surveillance Consortium investigators helps women and clinicians to possibly individualize screening frequency based on risk and BI-RADS categories. It will be important to track outcomes in women who undergo alternative screening frequencies to validate this approach.
Christine D. Berg, MD, is in the department of radiation oncology at Johns Hopkins Hospital, Baltimore. She reported receiving personal fees from Medial Early Sign. These comments are excerpted from an editorial accompanying Dr. Trentham-Dietz’s report (Ann Intern Med. 2016 Aug 22. doi: 10.7326/M16-1791).
FROM ANNALS OF INTERNAL MEDICINE
Key clinical point: Breast density is a key factor in determining appropriate screening intervals for mammography after age 50.
Major finding: Annual mammography is beneficial only in women with greater breast density and higher risk for breast cancer, who comprise less than 1% of the general population.
Data source: A comparison of three separate microsimulation models for breast cancer screening after age 50 years.
Disclosures: The study was supported by grants from the National Institutes of Health and several state public health departments and cancer registries in the United States. The researchers reported receiving grants and other support from the NIH, the American Society of Breast Surgeons, Renaissance Rx, Ally Clinical Diagnostics, the Netherlands National Institute for Public Health and the Environment, SCOR Global Risk Center, and Genomic Health Canada.