Algorithm for suspected pulmonary embolism safely cut CT rate
ROME – A newly validated, simplified algorithm for the management of patients with suspected acute pulmonary embolism enables physicians to safely exclude the disorder in roughly half of patients without resorting to CT pulmonary angiography, Tom van der Hulle, MD, reported at the annual congress of the European Society of Cardiology.
“This is the largest study ever performed in the diagnostic management of suspected pulmonary embolism. Based on our results, I think the YEARS algorithm is ready to be used in daily clinical practice,” declared Dr. van der Hulle of the department of thrombosis and hemostasis at Leiden (the Netherlands) University Medical Center.
Using the YEARS algorithm, PE was reliably ruled out without need for CT pulmonary angiography – considered the standard in the diagnosis of PE – in 48% of patients. In contrast, adherence to the Wells rule would have meant that 62% of patients would have gotten a CT scan to rule out PE with a comparably high degree of accuracy.
But that 62% figure underestimates the actual CT rate in clinical practice. The reality is that although the guideline-recommended Wells rule and revised Geneva score have been shown to be safe and accurate, they are so complex, cumbersome, and out of sync with the flow of routine clinical practice that many physicians skip the algorithms and go straight to CT, Dr. van der Hulle said. This approach results in many unnecessary CTs, needlessly exposing patients to the risks of radiation and intravenous contrast material while driving up health care costs, he added.
Using the Wells rule or revised Geneva score, the patient evaluation begins with an assessment of the clinical probability of PE based upon a risk score involving seven or eight factors. Only patients with a low or intermediate clinical probability of PE get a D-dimer test; those with a high clinical probability go straight to CT.
The YEARS algorithm is much simpler than that, Dr. van der Hulle explained. Everyone who presents with suspected acute PE gets a D-dimer test while the physician simultaneously applies a brief, three-item clinical prediction rule. These three items were selected by the Dutch investigators because they were the three strongest predictors of PE out of the original seven in the Wells rule. They are hemoptysis, clinical signs of deep vein thrombosis such as leg swelling or hyperpigmentation, and the clinician’s global impression of PE as being the most likely diagnosis.
In the YEARS algorithm, the threshold for a positive D-dimer test warranting CT pulmonary angiography depends upon whether any of the three clinical predictors is present. If none is present, the threshold is 1,000 ng/mL or above; if one or more is present, the threshold for a positive D-dimer test drops to 500 ng/mL.
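To make the triage logic concrete, the decision rule described above can be expressed in a few lines of code. This is only an illustrative sketch, not code from the study; the function name and argument names are assumptions, but the cutoffs (1,000 ng/mL with no YEARS items, 500 ng/mL with one or more) follow the description above.

```python
def years_triage(hemoptysis: bool,
                 signs_of_dvt: bool,
                 pe_most_likely: bool,
                 d_dimer_ng_per_ml: float) -> str:
    """Illustrative sketch of the YEARS triage rule described in the text."""
    items_present = sum([hemoptysis, signs_of_dvt, pe_most_likely])

    # The positive D-dimer threshold depends on whether any of the three
    # YEARS items is present: 1,000 ng/mL if none, 500 ng/mL if one or more.
    threshold = 1000 if items_present == 0 else 500

    if d_dimer_ng_per_ml >= threshold:
        return "CT pulmonary angiography"
    return "PE excluded without CT"


# Example: no YEARS items and a D-dimer of 800 ng/mL -> PE excluded without CT.
print(years_triage(False, False, False, 800))
```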
Using these criteria, PE was excluded without resort to CT in 1,306 patients with none of the three YEARS items and a D-dimer test result below 1,000 ng/mL, as well as in another 327 patients with one or more YEARS items present but a D-dimer below 500 ng/mL. Those two groups were left untreated and followed prospectively for 3 months.
The 964 patients with one or more YEARS predictors present and a D-dimer level of at least 500 ng/mL underwent CT imaging, as did the 352 with no YEARS items and a D-dimer of at least 1,000 ng/mL.
The prevalence of CT-confirmed PE in the study was 13.2%. Affected patients were treated with anticoagulants.
The primary study endpoint was the total rate of venous thromboembolism during 3 months of follow-up after PE had been excluded. The rate was 0.61%, including a fatal PE rate of 0.20%. The rate in patients managed without CT was 0.43%, including a 0.12% rate of fatal PE. In patients managed with diagnostic CT, the venous thromboembolism rate was 0.84%, with a fatal PE rate of 0.30%.
“I think these results are completely comparable to those in previous studies using the standard algorithms,” Dr. van der Hulle commented.
The study’s main limitation is that it wasn’t a randomized, controlled trial. But given the tiny event rates, detecting any small differences between management strategies would require an unrealistically huge sample size, he added.
Asked if he thinks physicians will actually use the new tool, Dr. van der Hulle replied that some physicians feel driven to be 100% sure that a patient doesn’t have PE, and they will probably keep overordering CT scans. But others will embrace the YEARS algorithm because it reduces wasted resources and minimizes radiation exposure, a particularly compelling consideration in young female patients.
Discussant Marion Delcroix, MD, had reservations. She said she appreciated the appeal of a simple algorithm, but she asked, “Couldn’t we do better with a bit more sophistication, perhaps by adjusting the D-dimer cutoff for age and also adding some other items, like oxygen saturation and estrogen use?
“My concern is about the applicability. The age of the study cohort is relatively young, at a mean of 53 years. The peak age of PE in a very large contemporary German database is 70-80 years. We don’t know if the YEARS score is any good in this older population,” asserted Dr. Delcroix, professor of medicine and respiratory physiology and head of the center for pulmonary vascular diseases at University Hospital in Leuven, Belgium.
“If the aim is to decrease the number of CT pulmonary angiograms for safety reasons, why not reintroduce compression ultrasound of the lower limbs in the diagnostic algorithm?” she continued. “It has been shown to effectively reduce the need for further imaging.”
Dr. Delcroix predicted that the YEARS algorithm study will prove “too optimistic” regarding the number of CT scans avoided, particularly in elderly patients.
The YEARS study was funded by the trial’s 12 participating Dutch hospitals. Dr. van der Hulle reported having no financial conflicts of interest.
Key clinical point: The simplified YEARS algorithm safely ruled out pulmonary embolism without CT pulmonary angiography in 48% of patients with suspected acute PE.
Major finding: When the YEARS algorithm was applied to a large population of patients with suspected PE, the 3-month incidence of venous thromboembolism after PE had been excluded was 0.61%.
Data source: This was a prospective study of clinical outcomes in nearly 3,000 consecutive Dutch patients who presented with suspected acute PE and were managed in accord with the YEARS algorithm.
Disclosures: The YEARS algorithm validation study was funded by the trial’s 12 participating Dutch hospitals. The study presenter reported having no financial conflicts of interest.
Exploration of Modern Military Research Resources
Advances in medical biotechnologies, data-gathering techniques, and -omics technologies have broadened the understanding of disease pathology and treatment and have facilitated the individualization of health care plans to meet the unique needs of each patient. Military medicine often has been at the forefront of medical technology, disease understanding, and clinical care both on and off the battlefield, in large part because of the unique resources available in the military health care system. These resources allow investigators to integrate vast amounts of epidemiologic data with an extensive database of biological samples from service members, which in the modern age has translated into advances in the understanding of melanoma and the treatment of scars.
History of Research in the Military
In the 1950s, the US Department of Defense (DoD) began collecting serum samples from its service members for research purposes.1 It was not until 1985 that the DoD implemented a long-term frozen storage system for serum samples obtained through mandatory screening of service members for human immunodeficiency virus (HIV).2 Subsequently, the Department of Defense Serum Repository (DoDSR) was officially established in 1989 as a central archive for the long-term storage of serum obtained from active-duty and reserve service members in the US Navy, Army, and Marines.2,3 In the mid-1990s, the DoDSR expanded its capabilities to include the storage of serum samples obtained predeployment and postdeployment from all military members, including the US Air Force.3,4 At that time, a records-keeping system was established, now known as the Defense Medical Surveillance System (DMSS). The DMSS provides an extensive epidemiologic database containing demographic data, service records, deployment data, reportable medical events, exposure history, and vaccination records, all of which can be linked to the serum samples of each service member.2-4 In 2008, responsibility for maintaining the DoDSR and the DMSS was transferred to the Armed Forces Health Surveillance Center (AFHSC).5
Several other databases created over the years provide additional support and resources to military investigators. The Automated Central Tumor Registry and the Department of Pathology and Area Laboratory Services both help investigators track the incidence of specific cancers in the military population and provide them with pathologic specimens. Additionally, electronic medical records, including the Composite Health Care System and the Armed Forces Health Longitudinal Technology Application, supplemented with insurance claims data accessible from the Military Health System Management and Reporting Tool (M2) database, have made it possible to track patient data.
Utilization of Military Research Resources
Today, the DoDSR is a secure facility that maintains more than 56 million serum specimens from more than 11 million individuals in –30°C freezers, making it one of the largest repositories in the world.3,6 Each serum sample is linked with an individual’s DMSS record, providing a way for investigators to study how external factors such as deployment history, occupation, and exposure history relate to an individual’s unique genetic and physiological makeup. Furthermore, these data can be used for seroepidemiologic investigations that contribute to all facets of clinical care. The AFHSC routinely publishes findings related to notifiable diseases, disease outbreaks, and disease trends in a monthly report.7
There are strict guidelines in place that limit access to the DoDSR and service members’ data. Use of the repository for information directly related to a patient’s health care is one reason for access, such as analyzing serum for antibodies and seroconversion to assist in the diagnosis of a disease such as HIV. Another reason would be to obtain information needed for criminal investigations and prosecution. Typically, these types of requests require a judge-issued court order and approval by the Assistant Secretary of Defense for Health Affairs.4 The DoDSR also is used to study force health protection issues, such as infectious disease incidence and disease prevalence in the military population.
Obtaining access to the DoDSR and service members’ data for research purposes requires that the principal investigator be a DoD employee. Each research proposal is reviewed by members of the AFHSC to determine if the DoDSR can meet the demands of the project, including having the appropriate number of serum samples and supporting epidemiologic data available. The AFHSC provides a letter of support if it deems the project to be in line with its current resources and capabilities. Each research proposal is then sent to an institutional review board (IRB) to determine if the study is exempt or needs to go through a full IRB review process. A study may be exempt if the investigators do not obtain data through interaction with living individuals and do not have access to any identifiable protected health information associated with the samples.6 Regardless of exemption status, the AFHSC de-identifies each sample before release, using a coding system to shield the patient’s identity from the investigator.
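As a generic illustration of the kind of coding system described above, the sketch below shows one common way a data custodian can replace direct identifiers with random study codes before releasing samples, keeping the linking key separate from the released data. This is an assumption for illustration only and does not describe the actual AFHSC or DoDSR implementation.

```python
import secrets

def assign_study_codes(service_member_ids):
    """Generic de-identification-by-coding sketch (not the actual AFHSC system):
    each identifier is replaced with a random study code, and the linking key
    is retained only by the data custodian, not the investigator."""
    linking_key = {}   # identifier -> code, kept by the custodian
    released = []      # codes released to investigators with the samples
    for member_id in service_member_ids:
        code = secrets.token_hex(8)   # random code with no embedded identity
        linking_key[member_id] = code
        released.append(code)
    return linking_key, released


# Hypothetical identifiers used purely for demonstration.
key, coded_samples = assign_study_codes(["A1234", "B5678"])
print(coded_samples)   # investigators see only the codes
```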
Resources within the military medical research system provide investigators with access to an extensive biorepository of serum and linked epidemiological data. Samples from the DoDSR have been used in no fewer than 75 peer-reviewed publications since 1985.8,9 Several of these studies have been influential in expanding knowledge about conditions seen more commonly in the military population, such as stress fractures, traumatic brain injuries, posttraumatic stress disorder, and suicide.8 Additionally, DoDSR samples have been used to form military vaccination policies and to track both infectious and noninfectious conditions in the military. For example, during the H1N1 influenza virus outbreak of 2009, the AFHSC was essential in helping to limit the spread of the virus within the military community by using its data and collaborating with groups such as the Centers for Disease Control and Prevention to develop a plan for disease surveillance and control.5
Several military research resources are currently being used for a melanoma study that aims to assess whether specific phenotypic features, melanoma risk alleles, and environmental factors (eg, duty station location, occupation, amount of UV exposure) can be used to develop better screening models to identify individuals who are at risk for developing melanoma. Secondarily, the study aims to determine whether recently developed multimarker diagnostic and prognostic assays for melanoma will prove useful in the diagnostic and prognostic assessment of melanocytic neoplasms in the military population. For this study, one of the authors (J.H.M.) is utilizing DoDSR serum from 1700 retrospective cases of invasive melanoma and 1700 matched controls. Additionally, the Automated Central Tumor Registry and Department of Pathology and Area Laboratory Services databases are being used to obtain tissue from more than 300 melanoma cases and nevi controls.
Limitations of the Current System
Despite the impressive capabilities of the current system, some issues limit its potential. One such limitation involves the way the serum samples at the DoDSR are utilized. Through 2012, the DoDSR held 54,542,658 serum specimens, of which only 228,610 (0.42%) had ever been accessed for study.8 With such a wealth of information and relative availability, why are the serum samples not being accessed more frequently for studies? One likely contributor is that the DoDSR is a restricted facility accessible only to DoD-affiliated investigators, which allows it to fulfill its primary purpose of supporting military-relevant investigations but at the same time limits the number and type of investigations that can be performed. One proposed solution is to allow civilian investigators access to the DoDSR if it can be proven that the research targets military-relevant issues.8 However, the current AFHSC access guidelines would need revision, and additional safeguards would be required to ensure that military-protected health information is not compromised. Nonetheless, such a change may result in more extensive use of DoDSR resources in the future.
An ethical issue that needs to be addressed is that the DoDSR permits the use of human serum samples for research purposes without obtaining consent from the individuals being studied. The serum samples are collected as part of mandatory predeployment and postdeployment examinations for HIV screening of all military members. These individuals are not informed of the potential use of their serum specimens for research, and no consent forms or opt-out options are provided. Although military members must comply with specific requirements pertaining to military readiness (eg, receiving appropriate vaccinations, drug testing, regular medical screening), it is debated whether they still retain the right as patients to refuse to participate in research and clinical trials.10 The AFHSC does have several regulatory steps in place to ensure that military members’ samples are used appropriately, including requiring a DoD primary investigator, IRB review of every research proposal, and de-identification of samples. At a minimum, giving military members the ability to provide informed consent would ensure that the military system is adhering to evolving human research standards.
The current lack of biological specimens other than serum in the DoDSR is another limitation of the system. Recent advances in molecular analyses have been driven by expanding -omics techniques, such as epigenomics, transcriptomics, and proteomics. Epigenomics is the study of reversible changes to DNA (eg, methylation) associated with specific disease states or with specific environmental exposures.9,11 Transcriptomics, which analyzes messenger RNA transcript levels of expressed genes, and proteomics, which analyzes protein expression, are 2 techniques being used to develop biomarkers associated with specific diseases and environmental exposures.9,11 Serum alone does not provide the high-quality nucleic acids needed for many of these studies. Adding whole-blood specimens or blood spot samples from military service members to the DoDSR would allow researchers to use these techniques to investigate many new biomarkers associated with military-relevant diseases and exposures. These techniques also can be used in the expanding field of personalized medicine, enabling health care providers to tailor all phases of care, including diagnosis and treatment, to an individual’s genetic profile.
Conclusion
The history of research in military medicine has been built on the primary goal of serving the men and women who put their lives in danger to protect this country. In an evolving environment of new technologies that have led to changes in service members’ injuries, exposures, and diseases, military medicine also must adapt. Resources such as the DoDSR and DMSS, which give investigators the unique ability to link epidemiological data with serum samples, have been invaluable contributors to this overall mission. As with any large system, there are always improvements that can be made. Improving access to the DoDSR serum samples, educating military service members and obtaining their consent to use their samples in research, and adding specimens to the DoDSR that can be used for -omics techniques are 3 changes that should be considered to maximize the capabilities of these resources to meet current and future needs.
- Liao SJ. Immunity status of military recruits in 1951 in the United States. I. Results of Schick tests. Am J Hyg. 1954;59:262-272.
- Rubertone MV, Brundage JF. The Defense Medical Surveillance System and the Department of Defense Serum Repository: glimpses of the future of public health surveillance. Am J Public Health. 2002;92:1900-1904.
- Department of Defense Serum Repository. Military Health System and the Defense Health Agency website. http://www.health.mil/Military-Health-Topics/Health-Readiness/Armed-Forces-Health-Surveillance-Branch/Data-Management-and-Technical-Support/Department-of-Defense-Serum-Repository. Accessed August 2, 2016.
- Perdue CL, Eick-Cost AA, Rubertone MV. A brief description of the operation of the DoD serum repository. Mil Med. 2015;180(10 suppl):10-12.
- DeFraites RF. The Armed Forces Health Surveillance Center: enhancing the Military Health System’s public health capabilities. BMC Public Health. 2011;11(suppl 2):S1.
- Pavlin JA, Welch RA. Ethics, human use, and the Department of Defense Serum Repository. Mil Med. 2015;180:49-56.
- Defense Medical Surveillance System. Military Health System and the Defense Health Agency website. http://www.health.mil/Military-Health-Topics/Health-Readiness/Armed-Forces-Health-Surveillance-Branch/Data-Management-and-Technical-Support/Defense-Medical-Surveillance-System. Accessed August 2, 2016.
- Perdue CL, Eick-Cost AA, Rubertone MV, et al. Description and utilization of the United States Department of Defense Serum Repository: a review of published studies, 1985-2012. PLoS One. 2015;10:1-16.
- Mancuso JD, Mallon TM, Gaydos JC. Maximizing the capabilities of the DoD serum repository to meet current and future needs: report of the needs panel. Mil Med. 2015;180:14-24.
- Department of Defense. Department of Defense Instruction. http://www.dtic.mil/whs/directives/corres/pdf/600014p.pdf. Posted September 26, 2001. Updated October 3, 2013. Accessed August 2, 2016.
- Lindler LE. Building a DoD biorepository for the future: potential benefits and way forward. Mil Med. 2015;180:90-94.
Advances in medical biotechnologies, data-gathering techniques, and -omics technologies have resulted in the broader understanding of disease pathology and treatment and have facilitated the individualization of health care plans to meet the unique needs of each patient. Military medicine often has been on the forefront of medical technology, disease understanding, and clinical care both on and off the battlefield, in large part due to the unique resources available in the military health care system. These resources allow investigators the ability to integrate vast amounts of epidemiologic data with an extensive biological sample database of its service members, which in the modern age has translated into advances in the understanding of melanoma and the treatment of scars.
History of Research in the Military
Starting in the 1950s, the US Department of Defense (DoD) started to collect serum samples of its service members for the purpose of research.1 It was not until 1985 that the DoD implemented a long-term frozen storage system for serum samples obtained through mandatory screening for human immunodeficiency virus (HIV) in service members.2 Subsequently, the Department of Defense Serum Repository (DoDSR) was officially established in 1989 as a central archive for the long-term storage of serum obtained from active-duty and reserve service members in the US Navy, Army, and Marines.2,3 In the mid-1990s, the DoDSR expanded its capabilities to include the storage of serum samples from all military members, including the US Air Force, obtained predeployment and postdeployment.3,4 At that time, a records-keeping system was established, now known as the Defense Medical Surveillance System (DMSS). The creation of the DMSS provided an extensive epidemiologic database that provided valuable information such as demographic data, service records, deployment data, reportable medical events, exposure history, and vaccination records, which could be linked to the serum samples of each service member.2-4 Since 2008, the responsibilities of maintaining the DoDSR and the DMSS were transferred to the Armed Forces Health Surveillance Center (AFHSC).5
There have been several other databases created over the years that provide additional support and resources to military investigators. The Automated Central Tumor Registry and Department of Pathology and Area Laboratory Services both help investigators to track the incidence of specific cancers in the military population and provide them with pathologic specimens. Additionally, electronic medical records including the composite health care system and the Armed Forces Health Longitudinal Technology Application supplemented with insurance claims data accessible from the Military Health System Management and Reporting Tool (M2) database have made it possible to track patient data.
Utilization of Military Research Resources
Today, the DoDSR is a secure facility that maintains more than 56 million serum specimens from more than 11 million individuals in –30°C freezers, making it one of the largest repositories in the world.3,6 Each serum sample is linked with an individual’s DMSS record, providing a way for investigators to study how external factors such as deployment history, occupation, and exposure history relate to an individual’s unique genetic and physiological makeup. Furthermore, these data can be used for seroepidemiologic investigations that contribute to all facets of clinical care. The AFHSC routinely publishes findings related to notifiable diseases, disease outbreaks, and disease trends in a monthly report.7
There are strict guidelines in place that limit access to the DoDSR and service members’ data. Use of the repository for information directly related to a patient’s health care is one reason for access, such as analyzing serum for antibodies and seroconversion to assist in the diagnosis of a disease such as HIV. Another reason would be to obtain information needed for criminal investigations and prosecution. Typically, these types of requests require a judge-issued court order and approval by the Assistant Secretary of Defense for Health Affairs.4 The DoDSR also is used to study force health protection issues, such as infectious disease incidence and disease prevalence in the military population.
Obtaining access to the DoDSR and service members’ data for research purposes requires that the principal investigator be a DoD employee. Each research proposal is reviewed by members of the AFHSC to determine if the DoDSR is able to meet the demands of the project, including having the appropriate number of serum samples and supporting epidemiologic data available. The AFHSC provides a letter of support if it deems the project to be in line with its current resources and capabilities. Each research proposal is then sent to an institutional review board (IRB) to determine if the study is exempt or needs to go through a full IRB review process. A study might be exempt if the investigators are not obtaining data through interaction with living individuals or not having access to any identifiable protected health information associated with the samples.6 Regardless of whether the study is exempt or not exempt, the AFHSC will de-identify each sample before releasing the samples to the investigators by using a coding system to shield the patient’s identity from the investigator.
Resources within the military medical research system provide investigators with access to an extensive biorepository of serum and linked epidemiological data. Samples from the DoDSR have been used in no less than 75 peer-reviewed publications since 1985.8,9 Several of these studies have been influential in expanding knowledge about conditions seen more commonly in the military population such as stress fractures, traumatic brain injuries, posttraumatic stress disorder, and suicide.8 Additionally, DoDSR samples have been used to form military vaccination policies and track both infectious and noninfectious conditions in the military; for example, during the H1N1 influenza virus outbreak of 2009, AFHSC was essential in helping to limit the spread of the virus within the military community by using its data and collaborating with groups such as the Centers for Disease Control and Prevention to develop a plan for disease surveillance and control.5
Several military research resources are currently being used for a melanoma study that aims to assess if specific phenotypic features, melanoma risk alleles, and environmental factors (eg, duty station location, occupation, amount of UV exposure) can be used to develop better screening models to identify individuals who are at risk for developing melanoma. Secondarily, the study aims to determine if recently developed multimarker diagnostic and prognostic assays for melanoma will prove useful in the diagnostic and prognostic assessment of melanocytic neoplasms in the military population. For this study, one of the authors (J.H.M) is utilizing DoDSR serum from 1700 retrospective cases of invasive melanoma and 1700 matched controls. Additionally, the Automated Central Tumor Registry and Department of Pathology and Area Laboratory Services databases are being used to obtain tissue from more than 300 melanoma cases and nevi controls.
Limitations of the Current System
Despite the impressive capabilities of the current system, there are some issues that limit its potential. One such limitation is associated with the way that the serum samples at the DoDSR are utilized. Through 2012, the DoDSR had 54,542,658 serum specimens available, of which only 228,610 (0.42%) had ever been accessed for study.8 With such a wealth of information and relative availability, why are the serum samples not being accessed more frequently for studies? The inherent nature of the DoDSR being a restricted facility and only accessible to DoD-affiliated investigators may contribute, which allows the DoDSR to fulfill its primary purpose of contributing to military-relevant investigations but at the same time limits the number and type of investigations that can be performed. One idea that has been proposed is allowing civilian investigator access to the DoDSR if it can be proven that the research is targeted toward military-relevant issues.8 However, the current AFHSC access guidelines would need revision and would require additional safeguards to ensure that military-protected health information is not compromised. Nonetheless, such a change may result in more extensive use of DoDSR resources in the future.
An ethical issue that needs to be addressed pertains to how the DoDSR permits use of human serum samples for research purposes without getting consent from the individuals being studied. The serum samples are collected as part of mandatory predeployment and postdeployment examinations for HIV screening of all military members. These individuals are not informed of potential use of their serum specimens for research purposes and no consent forms or opt-out options are provided. Although it is true that military members must comply with specific requirements pertaining to military readiness (eg, receiving appropriate vaccinations, drug testing, regular medical screening), it is debated whether they still retain the right as patients to refuse participating in research and clinical trials.10 The AFHSC does have several regulatory steps in place to ensure that military members’ samples are used in an appropriate manner, including requiring a DoD primary investigator, IRB review of every research proposal, and de-identification of samples. At a minimum, giving military members the ability to provide informed consent would ensure that the military system is adhering to evolving human research standards.
The current lack of biological specimens other than serum in the DoDSR is another limitation of the current system. Recent advances in molecular analyses are impacted by expanding -omics techniques, such as epigenomics, transcriptomics, and proteomics. The field of epigenomics is the study of reversible changes to DNA (eg, methylation) associated with specific disease states or following specific environmental exposures.9,11 Transcriptomics, which analyzes messenger RNA transcript levels of expressed genes, and proteomics, which uses expression of proteins, are 2 techniques being used to develop biomarkers associated with specific diseases and environmental exposures.9,11 Serum alone does not provide the high-quality nucleic acids needed for many of these studies to take place. Adding whole-blood specimens or blood spot samples of military service members to the DoDSR would allow researchers to use these techniques to investigate many new biomarkers associated with military-relevant diseases and exposures. These techniques also can be used in the expanding field of personalized medicine so that health care providers are able to tailor all phases of care, including diagnosis and treatment, to an individual’s genetic profile.
Conclusion
The history of research in military medicine has been built on achieving the primary goal of serving those men and women who put their lives in danger to protect this country. In an evolving environment of new technologies that have led to changes in service members’ injuries, exposures, and diseases, military medicine also must adapt. Resources such as the DoDSR and DMSS, which provide investigators with the unique ability to link epidemiological data with serum samples, have been invaluable contributors to this overall mission. As with any large system, there are always improvements that can be made. Improving access to the DoDSR serum samples, educating and obtaining consent from military service members to use their samples in research, and adding specimens to the DoDSR that can be used for -omics techniques are 3 changes that should be considered to maximize
Advances in medical biotechnologies, data-gathering techniques, and -omics technologies have resulted in the broader understanding of disease pathology and treatment and have facilitated the individualization of health care plans to meet the unique needs of each patient. Military medicine often has been on the forefront of medical technology, disease understanding, and clinical care both on and off the battlefield, in large part due to the unique resources available in the military health care system. These resources allow investigators the ability to integrate vast amounts of epidemiologic data with an extensive biological sample database of its service members, which in the modern age has translated into advances in the understanding of melanoma and the treatment of scars.
History of Research in the Military
Starting in the 1950s, the US Department of Defense (DoD) started to collect serum samples of its service members for the purpose of research.1 It was not until 1985 that the DoD implemented a long-term frozen storage system for serum samples obtained through mandatory screening for human immunodeficiency virus (HIV) in service members.2 Subsequently, the Department of Defense Serum Repository (DoDSR) was officially established in 1989 as a central archive for the long-term storage of serum obtained from active-duty and reserve service members in the US Navy, Army, and Marines.2,3 In the mid-1990s, the DoDSR expanded its capabilities to include the storage of serum samples from all military members, including the US Air Force, obtained predeployment and postdeployment.3,4 At that time, a records-keeping system was established, now known as the Defense Medical Surveillance System (DMSS). The creation of the DMSS provided an extensive epidemiologic database that provided valuable information such as demographic data, service records, deployment data, reportable medical events, exposure history, and vaccination records, which could be linked to the serum samples of each service member.2-4 Since 2008, the responsibilities of maintaining the DoDSR and the DMSS were transferred to the Armed Forces Health Surveillance Center (AFHSC).5
There have been several other databases created over the years that provide additional support and resources to military investigators. The Automated Central Tumor Registry and Department of Pathology and Area Laboratory Services both help investigators to track the incidence of specific cancers in the military population and provide them with pathologic specimens. Additionally, electronic medical records including the composite health care system and the Armed Forces Health Longitudinal Technology Application supplemented with insurance claims data accessible from the Military Health System Management and Reporting Tool (M2) database have made it possible to track patient data.
Utilization of Military Research Resources
Today, the DoDSR is a secure facility that maintains more than 56 million serum specimens from more than 11 million individuals in –30°C freezers, making it one of the largest repositories in the world.3,6 Each serum sample is linked with an individual’s DMSS record, providing a way for investigators to study how external factors such as deployment history, occupation, and exposure history relate to an individual’s unique genetic and physiological makeup. Furthermore, these data can be used for seroepidemiologic investigations that contribute to all facets of clinical care. The AFHSC routinely publishes findings related to notifiable diseases, disease outbreaks, and disease trends in a monthly report.7
There are strict guidelines in place that limit access to the DoDSR and service members’ data. Use of the repository for information directly related to a patient’s health care is one reason for access, such as analyzing serum for antibodies and seroconversion to assist in the diagnosis of a disease such as HIV. Another reason would be to obtain information needed for criminal investigations and prosecution. Typically, these types of requests require a judge-issued court order and approval by the Assistant Secretary of Defense for Health Affairs.4 The DoDSR also is used to study force health protection issues, such as infectious disease incidence and disease prevalence in the military population.
Obtaining access to the DoDSR and service members’ data for research purposes requires that the principal investigator be a DoD employee. Each research proposal is reviewed by members of the AFHSC to determine if the DoDSR is able to meet the demands of the project, including having the appropriate number of serum samples and supporting epidemiologic data available. The AFHSC provides a letter of support if it deems the project to be in line with its current resources and capabilities. Each research proposal is then sent to an institutional review board (IRB) to determine if the study is exempt or needs to go through a full IRB review process. A study might be exempt if the investigators are not obtaining data through interaction with living individuals or not having access to any identifiable protected health information associated with the samples.6 Regardless of whether the study is exempt or not exempt, the AFHSC will de-identify each sample before releasing the samples to the investigators by using a coding system to shield the patient’s identity from the investigator.
Resources within the military medical research system provide investigators with access to an extensive biorepository of serum and linked epidemiological data. Samples from the DoDSR have been used in no less than 75 peer-reviewed publications since 1985.8,9 Several of these studies have been influential in expanding knowledge about conditions seen more commonly in the military population such as stress fractures, traumatic brain injuries, posttraumatic stress disorder, and suicide.8 Additionally, DoDSR samples have been used to form military vaccination policies and track both infectious and noninfectious conditions in the military; for example, during the H1N1 influenza virus outbreak of 2009, AFHSC was essential in helping to limit the spread of the virus within the military community by using its data and collaborating with groups such as the Centers for Disease Control and Prevention to develop a plan for disease surveillance and control.5
Several military research resources are currently being used for a melanoma study that aims to assess if specific phenotypic features, melanoma risk alleles, and environmental factors (eg, duty station location, occupation, amount of UV exposure) can be used to develop better screening models to identify individuals who are at risk for developing melanoma. Secondarily, the study aims to determine if recently developed multimarker diagnostic and prognostic assays for melanoma will prove useful in the diagnostic and prognostic assessment of melanocytic neoplasms in the military population. For this study, one of the authors (J.H.M) is utilizing DoDSR serum from 1700 retrospective cases of invasive melanoma and 1700 matched controls. Additionally, the Automated Central Tumor Registry and Department of Pathology and Area Laboratory Services databases are being used to obtain tissue from more than 300 melanoma cases and nevi controls.
Limitations of the Current System
Despite the impressive capabilities of the current system, there are some issues that limit its potential. One such limitation is associated with the way that the serum samples at the DoDSR are utilized. Through 2012, the DoDSR had 54,542,658 serum specimens available, of which only 228,610 (0.42%) had ever been accessed for study.8 With such a wealth of information and relative availability, why are the serum samples not being accessed more frequently for studies? The inherent nature of the DoDSR being a restricted facility and only accessible to DoD-affiliated investigators may contribute, which allows the DoDSR to fulfill its primary purpose of contributing to military-relevant investigations but at the same time limits the number and type of investigations that can be performed. One idea that has been proposed is allowing civilian investigator access to the DoDSR if it can be proven that the research is targeted toward military-relevant issues.8 However, the current AFHSC access guidelines would need revision and would require additional safeguards to ensure that military-protected health information is not compromised. Nonetheless, such a change may result in more extensive use of DoDSR resources in the future.
An ethical issue that needs to be addressed pertains to how the DoDSR permits use of human serum samples for research purposes without getting consent from the individuals being studied. The serum samples are collected as part of mandatory predeployment and postdeployment examinations for HIV screening of all military members. These individuals are not informed of potential use of their serum specimens for research purposes and no consent forms or opt-out options are provided. Although it is true that military members must comply with specific requirements pertaining to military readiness (eg, receiving appropriate vaccinations, drug testing, regular medical screening), it is debated whether they still retain the right as patients to refuse participating in research and clinical trials.10 The AFHSC does have several regulatory steps in place to ensure that military members’ samples are used in an appropriate manner, including requiring a DoD primary investigator, IRB review of every research proposal, and de-identification of samples. At a minimum, giving military members the ability to provide informed consent would ensure that the military system is adhering to evolving human research standards.
The current lack of biological specimens other than serum in the DoDSR is another limitation of the current system. Recent advances in molecular analyses are impacted by expanding -omics techniques, such as epigenomics, transcriptomics, and proteomics. The field of epigenomics is the study of reversible changes to DNA (eg, methylation) associated with specific disease states or following specific environmental exposures.9,11 Transcriptomics, which analyzes messenger RNA transcript levels of expressed genes, and proteomics, which uses expression of proteins, are 2 techniques being used to develop biomarkers associated with specific diseases and environmental exposures.9,11 Serum alone does not provide the high-quality nucleic acids needed for many of these studies to take place. Adding whole-blood specimens or blood spot samples of military service members to the DoDSR would allow researchers to use these techniques to investigate many new biomarkers associated with military-relevant diseases and exposures. These techniques also can be used in the expanding field of personalized medicine so that health care providers are able to tailor all phases of care, including diagnosis and treatment, to an individual’s genetic profile.
Conclusion
The history of research in military medicine has been built on achieving the primary goal of serving those men and women who put their lives in danger to protect this country. In an evolving environment of new technologies that have led to changes in service members’ injuries, exposures, and diseases, military medicine also must adapt. Resources such as the DoDSR and DMSS, which provide investigators with the unique ability to link epidemiological data with serum samples, have been invaluable contributors to this overall mission. As with any large system, there are always improvements that can be made. Improving access to the DoDSR serum samples, educating and obtaining consent from military service members to use their samples in research, and adding specimens to the DoDSR that can be used for -omics techniques are 3 changes that should be considered to maximize
- Liao SJ. Immunity status of military recruits in 1951 in the United States. I. results of Schick tests. Am J Hyg. 1954;59:262-272.
- Rubertone MV, Brundage JF. The defense medical surveillance system and the department of defense serum repository: glimpses of the future of public health surveillance. Am J Public Health. 2002;92:1900-1904.
- Department of Defense Serum Repository. Military Health System and the Defense Health Agency website. http://www.health.mil/Military-Health-Topics/Health-Readiness/Armed-Forces-Health-Surveillance-Branch/Data-Management-and-Technical-Support/Department-of-Defense-Serum-Repository. Accessed August 2, 2016.
- Perdue CL, Eick-Cost AA, Rubertone MV. A brief description of the operation of the DoD serum repository. Mil Med. 2015;180(10 suppl):10-12.
- DeFraites RF. The Armed Forces Health Surveillance Center: enhancing the Military Health System’s public health capabilities. BMC Public Health. 2011;11(suppl 2):S1.
- Pavlin JA, Welch RA. Ethics, human use, and the department of defense serum repository. Mil Med. 2015;180:49-56.
- Defense Medical Surveillance System. Military Health System and the Defense Health Agency website. http://www.health.mil/Military-Health-Topics/Health-Readiness/Armed-Forces-Health-Surveillance-Branch/Data-Management-and-Technical-Support/Defense-Medical-Surveillance-System. Accessed August 2, 2016.
- Perdue CL, Eick-Cost AA, Rubertone MV, et al. Description and utilization of the United States Department of Defense Serum Repository: a review of published studies, 1985-2012. PLoS One. 2015;10:1-16.
- Mancuso JD, Mallon TM, Gaydos JC. Maximizing the capabilities of the DoD serum repository to meet current and future needs: report of the needs panel. Mil Med. 2015;180:14-24.
- Department of Defense. Department of Defense Instruction. http://www.dtic.mil/whs/directives/corres/pdf/600014p.pdf. Posted September 26, 2001. Updated October 3, 2013. Accessed August 2, 2016.
- Lindler LE. Building a DoD biorepository for the future: potential benefits and way forward. Mil Med. 2015;180:90-94.
Practice Points
- Large patient databases and tissue repositories are increasingly being used to improve patient care through the use of clinical data, genomics, proteomics, and metabolomics.
- The US Military has an established electronic medical record as well as tissue and serum repositories that can be leveraged to study melanoma and other dermatologic diseases.
ATA’s risk assessment guidelines for thyroid nodules using sonography patterns validated
DENVER – The malignancy risk of thyroid nodules can be assessed with reassuring accuracy using ultrasound and the guidelines developed by the American Thyroid Association.
Ultrasound assessment is the first step of the evaluation of any patient with one or more thyroid nodules. “Maybe it shouldn’t be, but, for now, it is,” noted David L. Steward, MD, at the annual meeting of the American Thyroid Association.
The ATA guidelines categorize thyroid nodules on the basis of their ultrasound patterns, with the highest risk of malignancy in nodules that are taller than they are wide and/or have microcalcifications, irregular margins, hypoechoic areas, extrathyroidal extension, interrupted rim calcification with soft tissue extrusion, or suspicious lymph nodes. Between 70% and 90% of nodules with such patterns will contain malignancy, according to the ATA guidelines. Lesions with an intermediate risk of malignancy have such sonographic findings as hypoechoic solid tissue and regular margins; between 10% and 20% of these are malignant. The third category in the ATA’s guidelines comprises low-suspicion nodules, with hyperechoic solid tissue, isoechoic solid tissue, partially cystic composition with an eccentric solid area, and regular margins; 5%-10% of these are malignant. Thyroid nodules with a very low risk of malignancy (less than 3%) are spongiform or partially cystic with no suspicious findings. Finally, benign nodules, of which less than 1% contain malignancy, are cysts, he said.
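To make the tiering concrete, here is a minimal Python sketch pairing the sonographic pattern categories described above with the approximate malignancy-risk ranges quoted from the guidelines. It is illustrative only, not an official ATA tool; the category labels, dictionary structure, and helper function are the editor's own.

```python
# Illustrative sketch: ATA sonographic pattern category -> approximate malignancy risk.
# Risk ranges are those quoted in the article; labels and structure are not official.

ATA_RISK = {
    "high suspicion":         (0.70, 0.90),  # taller-than-wide, microcalcifications, irregular margins, etc.
    "intermediate suspicion": (0.10, 0.20),  # hypoechoic solid tissue, regular margins
    "low suspicion":          (0.05, 0.10),  # iso-/hyperechoic solid or partially cystic with eccentric solid area
    "very low suspicion":     (0.00, 0.03),  # spongiform or partially cystic, no suspicious findings
    "benign":                 (0.00, 0.01),  # pure cysts
}

def malignancy_risk_range(category: str) -> tuple[float, float]:
    """Return the approximate (low, high) malignancy probability for an ATA pattern category."""
    return ATA_RISK[category.lower()]

if __name__ == "__main__":
    lo, hi = malignancy_risk_range("intermediate suspicion")
    print(f"Intermediate-suspicion nodule: roughly {lo:.0%}-{hi:.0%} risk of malignancy")
```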
“We found that the size of the nodule on ultrasound that underwent fine needle aspiration was inversely correlated with malignancy risk: The lower risk nodules were larger,” he said.
Using the ATA’s system, 9 (4%) of the nodules were high risk, 64 (31%) were intermediate risk, 79 (38%) were low risk, 54 (26%) were very-low risk, and none were benign. Five of the nodules were not included in the results presented.
There was good correlation between the Bethesda and ATA classification systems. Of the lesions that were malignant or suspicious for malignancy in the Bethesda system, 77% were very-high risk for malignancy on ultrasound according to the ATA. Of the lesions that were atypia of undetermined significance (AUS)/follicular lesion of undetermined significance (FLUS), 22% were very high risk according to the ATA. Neither system classified as malignant any of the lesions categorized as follicular/Hurthle cell cancer, benign, or nondiagnostic.
The AUS/FLUS nodules “tend to be all over the map,” he noted. Looking at just the AUS/FLUS nodules, malignancy was found on pathology in 100% classified by the ATA system as being high risk; in 21% of those called intermediate risk; in 17% of those called low risk; and in 12% of the very-low risk group.
The study was funded by the University of Cincinnati. Dr. Steward said his only disclosure is that he was a member of the ATA committee that wrote the guidelines under evaluation in this study.
Key clinical point: Ultrasound risk stratification using the ATA guidelines reliably predicts the malignancy risk of thyroid nodules.
Major finding: Of the lesions that were malignant or suspicious for malignancy in the Bethesda system, 77% were very-high risk for malignancy on ultrasound, according to the ATA.
Data source: Prospective validation of the ATA’s ultrasound risk assessment guidelines on 211 thyroid nodules excised from 199 patients.
Disclosures: The study was funded by the University of Cincinnati. Dr. Steward said his only disclosure is that he was a member of the ATA committee that wrote the guidelines under evaluation in this study.
Better GI, urinary function after pelvic radiation with IMRT
BOSTON – For women with cervical or endometrial cancers, postoperative pelvic irradiation with intensity-modulated radiation therapy (IMRT) is associated with fewer acute gastrointestinal and genitourinary side effects, better physical functioning, and better quality of life than standard four-field pelvic radiation therapy, investigators contend.
Five weeks after the start of radiation therapy, women treated with pelvic IMRT in a phase III multicenter randomized trial had significantly better bowel and urinary function scores on the Expanded Prostate Index Composite (EPIC) scale, a patient-reported outcomes instrument, said co-principal investigator Ann H. Klopp, MD, from the University of Texas MD Anderson Cancer Center in Houston.
The trial, nicknamed TIME-C, was specifically designed to determine whether IMRT could reduce acute GI toxicities relative to standard therapy in the 5th week of treatment, after 23 to 25 radiation fractions had been delivered, with urinary toxicities and quality of life measures as secondary endpoints.
Eligible patients were women with pathologically proven diagnoses of endometrial and/or cervical cancer who required postoperative radiation or chemoradiation and had good performance statuses.
Following stratification by dose level (45 or 50.4 Gy), chemotherapy (five cycles of weekly cisplatin 40 mg/m2) or no chemotherapy, and disease site, the patients were randomly assigned to undergo either IMRT (129 patients) or standard four-field radiation (149 patients) to the pelvis.
Patients were evaluated for symptoms at baseline, 3 and 5 weeks after the start of radiation, and 4-5 weeks after completion, and follow-up is planned for 1 and 3 years after the start of radiation therapy.
EPIC findings
For the primary endpoint of change in the composite EPIC bowel function and bother score from baseline, patients in both arms had declines in scores, signaling increased symptoms, but the decline was significantly greater among patients treated with four-field radiation (mean 23.6-point decline) than with IMRT (mean 18.6-point decline; P = .048). Viewed separately, bowel function but not bowel bother scores were significantly lower in the standard radiation group. By 4 to 6 weeks after therapy, however, scores in both groups had recovered to baseline levels, Dr. Klopp noted.
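For readers unfamiliar with change-from-baseline comparisons, the following Python sketch shows, in general terms, how mean score declines in two arms might be compared. The arrays are synthetic placeholders seeded to roughly match the reported mean declines; this is not TIME-C data, the trial's actual statistical method is not specified here, and the output will not reproduce the trial's exact P value.

```python
# Illustrative sketch: between-arm comparison of change-from-baseline scores
# (as in an EPIC bowel summary analysis). All numbers below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical baseline and week-5 EPIC bowel summary scores (0-100, higher = better)
imrt_baseline = rng.normal(90, 8, 129)
imrt_week5    = imrt_baseline - rng.normal(18.6, 20, 129)  # mean ~18.6-point decline
std_baseline  = rng.normal(90, 8, 149)
std_week5     = std_baseline - rng.normal(23.6, 20, 149)   # mean ~23.6-point decline

imrt_change = imrt_week5 - imrt_baseline  # negative change = worse symptoms
std_change  = std_week5 - std_baseline

t_stat, p_value = stats.ttest_ind(imrt_change, std_change)
print(f"Mean decline, IMRT: {-imrt_change.mean():.1f}; "
      f"four-field: {-std_change.mean():.1f}; p = {p_value:.3f}")
```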
Similarly, bowel-related scores on the Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events measure (PRO-CTCAE), a secondary endpoint, were significantly worse among patients who underwent standard radiation for the domains of diarrhea, fecal frequency, and fecal interference. There were no significant differences between the groups in abdominal pain measures, however.
For the secondary endpoint of EPIC urinary scores, IMRT was also associated with lower toxicities, with a mean urinary summary score decline of 5.6 points, compared with a 10.4-point drop among patients treated with standard four-field radiation (P = .03).
Finally, patients on IMRT had a smaller comparative decline on the physical well-being domain of the Functional Assessment of Cancer Therapy cervical cancer scale (P = .03).
The results support what many radiation oncologists believe but have not been able to prove until now, commented Geraldine Jacobson, MD, MPH, professor and chair of radiation oncology at West Virginia University, Morgantown.
But Supriya Chopra, MD, a radiation oncologist at the Tata Memorial Centre in Mumbai, India, the invited discussant at the plenary, said that the evidence in favor of IMRT is not so clear.
At the 2015 ASTRO annual meeting, Dr. Chopra and colleagues presented interim results of the PARCER study, in which grade III or higher radiation-induced bowel toxicity was lower with IMRT than with 3D-conformal radiation therapy. However, the 14.6% absolute difference, while significant (P = .02), was an exploratory endpoint only, and the observed difference in grade II or greater toxicities was not significant.
Differences in results between PARCER and TIME-C may be explained by the fact that patients in the PARCER trial had a higher proportion of concurrent cisplatin-based chemotherapy (about 88%) than patients in TIME-C (about 75%), with the excess cisplatin in the former trial possibly contributing to a worse symptom burden, she suggested.
“Pooled data from both trials is needed to assess the impact of IMRT for at least physician-reported acute GI toxicity, which both trials have captured. Long-term data from TIME-C and the final analysis of PARCER is awaited to assess the impact of late GI toxicity, and in my opinion, postoperative IMRT for gynecological cancers continues to be investigational,” she said.
Key clinical point: Pelvic irradiation with intensity-modulated radiation therapy was associated with lower acute bowel toxicity than standard radiation.
Major finding: The decline in EPIC bowel summary scores was smaller with IMRT than with four-field pelvic irradiation.
Data source: Randomized phase III trial in 278 patients with cervical or endometrial cancers.
Disclosures: TIME-C was supported by the National Cancer Institute. Dr. Klopp, Dr. Jacobson, and Dr. Chopra reported no relevant conflicts of interest.
Robotic surgery boasts fewer postoperative complications in radical hysterectomy
BOSTON – Robot-assisted radical hysterectomy is just as safe as, or perhaps safer than, open surgery, according to a new study that examined perioperative and postoperative outcomes with long-term follow-up for both types of procedures.
“Robotic surgery has been expanding for the last 20 years, but still the recurrence rate with cancer patients is missing data because very few studies are published; they don’t have long-term oncologic outcomes, and if [the technology] works properly we have to put it into the literature,” M. Bilal Sert, MD, of Oslo University, said at the annual Minimally Invasive Surgery Week.
Dr. Sert and his coinvestigators identified 215 women who underwent either open or robot-assisted radical hysterectomy between November 2005 and December 2012. All of the procedures were elective and the robot-assisted operations were performed using the da Vinci robotic surgical platform. After excluding neoadjuvant cases, which totaled 19, the researchers looked at data on 196 patients (122 open radical hysterectomy cases and 74 robot-assisted radical hysterectomy cases).
On average, operating time for open radical hysterectomy was 171 minutes, versus 263 minutes for robot-assisted radical hysterectomy. However, the robotic surgery arm had lower mean estimated blood loss than the open surgery cohort: 80 milliliters versus 468 milliliters, respectively (P = .003). Follow-up time frames were shorter in the robotic surgery cohort by 6 months: 46 months reported for robotic surgery, compared with a 52-month average experienced by those in the open surgery cohort.
Both groups experienced recurrences, occurring in 12 patients in the open surgery cohort (9.8%) and 9 patients in the robotic surgery cohort (12.1%), a difference that was not statistically significant (P = .3). Similarly, rates of perioperative complications were 8% for open surgery and 11% for robotic surgery, which also was not significantly different (P = .3).
However, rates of postoperative complications were 36% for open surgery and 12% for robotic surgery (P = .001), which was statistically significant.
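As a rough illustration of how such a comparison of complication rates can be tested, the sketch below runs Fisher's exact test on event counts back-calculated from the percentages reported above (36% of 122 open cases, 12% of 74 robot-assisted cases). The counts are approximations, the study's actual test is not stated, and the computed P value will only be in the same neighborhood as the reported P = .001.

```python
# Illustrative sketch: Fisher's exact test on approximate postoperative complication counts.
from scipy.stats import fisher_exact

open_events, open_total   = round(0.36 * 122), 122  # ~44 of 122 open cases
robot_events, robot_total = round(0.12 * 74), 74    # ~9 of 74 robot-assisted cases

table = [[open_events, open_total - open_events],
         [robot_events, robot_total - robot_events]]

odds_ratio, p_value = fisher_exact(table)
print(f"Odds ratio ~{odds_ratio:.2f}, Fisher exact p ~{p_value:.4f}")
```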
“Based on our data, I can say that [robot-assisted radical hysterectomy] is safe, and in fact I prefer to use the robot,” Dr. Sert said at the meeting, which was held by the Society of Laparoendoscopic Surgeons. “Of course, robot-assisted surgery will not automatically make you a better surgeon, but on more complicated radical hysterectomy patients, it will help make the surgeon more precise.”
No funding source was disclosed for this study. Dr. Sert reported having no relevant financial disclosures.
Key clinical point: Robot-assisted radical hysterectomy was associated with fewer postoperative complications than open surgery, with similar recurrence rates.
Major finding: Postoperative complications were 36% for patients who underwent open radical hysterectomy, compared with 12% for those undergoing robot-assisted radical hysterectomy (P = .001).
Data source: Retrospective review of data on 215 patients who underwent open or robot-assisted radical hysterectomy between November 2005 and December 2012.
Disclosures: Dr. Sert reported having no relevant financial disclosures.
The Proposed Rule and Payments for 2017: The Good, the Bad, and the Ugly
Just as Charlie Brown looks forward to the coming of the Great Pumpkin each Halloween, those of us who dance in the minefields of payment policy await the publication of the Proposed Rule, more formally known as the “Revisions to Payment Policies under the Physician Fee Schedule and Other Revisions to Part B for CY 2017.”1,2 You could read the entire tome—a mere 316 pages (excluding the hundreds of pages of granular supplement data discussed in the last few columns)—or simply read what I have outlined as the good, the bad, and the ugly for the Proposed Rule for 2017.
The Good
In 2017, dermatology will increase its share of the pie by 1% to $3.505 billion of a total $89.467 billion expected to be expended for physician services.1 The effect on individual providers will vary by geographic location and practice mix. Half of this increase comes from the 0.5% across-the-board update for all physicians mandated by the Medicare Access and CHIP Reauthorization Act (MACRA).3
Current Procedural Terminology (CPT) codes for reflectance confocal microscopy (96931–96936) will have Centers for Medicare & Medicaid Services valuations beginning in 2017, and individuals performing this service should be able to report it and be paid for their efforts.1 The values are below the American Medical Association/Specialty Society Relative Value Scale Update Committee (RUC) recommendations.
The Bad
Payment rates for 2017 will be based on a conversion factor of 35.7751,1 a drop from the 2016 conversion factor of 35.8043. Cuts will be made for some specialties. Gastroenterology, nephrology, neurosurgery, radiology, urology, and radiation therapy centers will take a 1% hit; ophthalmology, pathology, and vascular surgery will take 2% cuts; and interventional radiology will lose 7%.1 A special case within dermatology and pathology is a 15% cut to the technical component of slide preparation for CPT code 88305,4 due to a redefinition of the valuation of eosin stains.2 While the accuracy and precision of these practice expense inputs can be debated, the government by definition makes the rules, and involved specialties had an opportunity to appeal this change through the comment process that ended on September 6, 2016. The government can take comments into account, but substantial changes usually are not made from the Proposed Rule to the Final Rule, which usually arrives around the beginning of November; however, in an election year, the Final Rule can be a few weeks late.
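For context on what the conversion factor does, the sketch below applies the standard physician fee schedule formula, in which geographically adjusted relative value units (RVUs) are multiplied by the conversion factor to yield a payment. The RVU and GPCI figures in the example are made-up placeholders for illustration, not values from the Proposed Rule.

```python
# Illustrative sketch: standard PFS payment formula.
# Payment = (work RVU x work GPCI + PE RVU x PE GPCI + MP RVU x MP GPCI) x conversion factor.
def pfs_payment(work_rvu, pe_rvu, mp_rvu,
                work_gpci=1.0, pe_gpci=1.0, mp_gpci=1.0,
                conversion_factor=35.7751):
    """Geographically adjusted total RVUs times the conversion factor."""
    total_rvus = work_rvu * work_gpci + pe_rvu * pe_gpci + mp_rvu * mp_gpci
    return total_rvus * conversion_factor

# Hypothetical code with 1.10 work, 1.50 practice-expense, and 0.10 malpractice RVUs
print(f"${pfs_payment(1.10, 1.50, 0.10):.2f}")  # about $96.59 at the 2017 proposed conversion factor
```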
The Ugly
The government will increase its unfunded mandates with the creation of new Medicare G codes (global services codes) that will allow the government to track the provision of postoperative care for all 010 and 090 global service periods (Table 1). The codes look mostly at time, do not clearly take into account the severity or complexity of the conditions being cared for, and will be reported on claim forms as an unfunded mandate with more confusion and cost.1 Because not all claim-paying intermediaries are likely to have these G codes smoothly set up in their systems, there will still be a cost to filing the claim. Unless changes occur in the Final Rule, which is unlikely, there will be no payment for the time and effort of submitting these claims. The goal of the US Government is to home in on postoperative services and pare them down so it can cut payments wherever possible beginning in 2019.1 Everyone wants to save money, from the consumer5 to the payer, and the ultimate payer is playing hardball. Additional validation efforts likely will lower physician fee-for-service payments further.
The US Government also is taking a shot at what it calls “misvalued services” that have not had recent refinement within the RUC process.1 The work list for 2017 includes a number of 000 global period codes where additional evaluation and management services are reported using modifier -25, which implies a substantial, separately identifiable cognitive service performed by the same physician on the day of a procedure, above and beyond other services provided or beyond the usual preservice and postservice care associated with the procedure that was performed. Although codes such as biopsies (11100 and 11101) and premalignant destructions (17000–17004) have an adjustment built in, and dermatologists who provide services on the same day are actually penalized by the multiple built-in reductions that are already additive, the government is concerned that 19% of the 000 global services were billed more than 50% of the time with an evaluation and management code with modifier -25. Eighty-three codes met the criteria under which the government believes it may be overpaying1; the codes of interest to dermatology are shown in Table 2.1
The refinement of global periods will be an ongoing exercise through 2017 and beyond, with results likely to play an important role in the 2019 fee schedule. These global period reviews, combined with some Stark law refinement relating to the leasing of space at market rates while disallowing the landlord physician from receiving patient referrals from the tenant, may also affect practitioner income.1,6 I never cease to be amazed that former Congressman Fortney Hillman “Pete” Stark (D), whose antikickback scheme keeps expanding, never went after the banking and brokerage industries. The founder of the $1.1 billion Security National Bank, a small bank in Walnut Creek, California,7 never focused on regulating banks. In his 40-year congressional career, he decided physicians make better targets. His efforts have not helped physicians but have helped lawyers, as he is quick to acknowledge.8
Final Thoughts
I end this column with an appeal to the dermatologists of America. Go to the American Academy of Dermatology Association Political Action Committee website (https://skinpac.org/), the home page of the only political action committee that represents the dermatology specialty, and consider making a donation. Emergency medicine physicians created the “Giving a Shift” campaign, in which physicians donate one shift’s earnings to their national political action committee; most of us could easily donate a half day’s income, as the only way to potentially change the increasingly onerous burdens on practitioners is through political action. As we say at RUC meetings, you can eat lunch or be lunch. The choice is yours.
- Medicare Program; Revisions to Payment Policies Under the Physician Fee Schedule and Other Revisions to Part B for CY 2017; Medicare Advantage Pricing Data Release; Medicare Advantage and Part D Medical Loss Ratio Data Release; Medicare Advantage Provider Network Requirements; Expansion of Medicare Diabetes Prevention Program Model. Fed Regist. 2016;81(136):46162-46476. To be codified at 42 CFR §405, 410, 411, et al. https://www.gpo.gov/fdsys/pkg/FR-2016-07-15/pdf/2016-16097.pdf. Accessed September 7, 2016.
- Revisions to Payment Policies under the Physician Fee Schedule and Other Revisions to Part B for CY 2017. Centers for Medicare & Medicaid Services website. https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/PhysicianFeeSched/PFS-Federal-Regulation-Notices-Items/CMS-1654-P.html. Accessed September 7, 2016.
- Text of the Medicare Access and CHIP Reauthorization Act of 2015. GovTrack website. https://www.govtrack.us/congress/bills/114/hr2/text. Accessed September 9, 2016.
- Kaplan KJ. Proposed Medicare 2017 reimbursement schedule whacks biopsy payments; digital pathology payments up. Digital Pathology Blog website. http://tissuepathology.com/2016/07/20/proposed-medicare-2017-reimbursement-schedule-whacks-biopsy-payments-digital-pathology-payments-up/#ixzz4HEqBLgzu. Published July 20, 2016. Accessed September 7, 2016.
- Abelson R. Cost, not choice, is top concern of health insurance customers. The New York Times. http://www.nytimes.com/2016/08/13/business/cost-not-choice-is-top-concern-of-health-insurance-customers.html?_r=0. Published August 12, 2016. Accessed September 7, 2016.
- Stark Law website. http://starklaw.org/. Accessed September 7, 2016.
- Pete Stark. Freedom From Religion website. https://ffrf.org/news/day/dayitems/item/14800-pete-stark. Accessed September 19, 2016.
- Adamy J. Pete Stark: Law regulating doctors mostly helped lawyers. The Wall Street Journal. October 22, 2014. http://blogs.wsj.com/washwire/2014/10/22/pete-stark-law-regulating-doctors-mostly-helped-lawyers/. Accessed September 19, 2016.
Just as Charlie Brown looks forward to the coming of the Great Pumpkin each Halloween, those of us who dance in the minefields of payment policy await the publication of the Proposed Rule, more formally known as the “Revisions to Payment Policies under the Physician Fee Schedule and Other Revisions to Part B for CY 2017.”1,2 You could read the entire tome—a mere 316 pages (excluding the hundreds of pages of granular supplement data discussed in the last few columns)—or simply read what I have outlined as the good, the bad, and the ugly for the Proposed Rule for 2017.
The Good
In 2017, dermatology will increase its share of the pie by 1% to $3.505 billion of a total $89.467 billion expected to be expended for physician services.1 The effect on individual providers will vary by geographic location and practice mix. Half is from the 0.5% increase that has come to all physicians across the board as mandated by the Medicare Access and CHIP Reauthorization Act (MACRA).3
Current Procedural Terminology (CPT) codes for reflectance confocal microscopy (96931–96936) will have Centers for Medicare & Medicaid Services valuations beginning in 2017, and individuals performing this service should be able to report it and be paid for their efforts.1 The values are below the American Medical Association/Specialty Society Relative Value Scale Update Committee (RUC) recommendations.
The Bad
Payment rates for 2017 will be based on a conversion factor of 35.7751,1 a drop from the 2016 conversion factor of 35.8043. Cuts will be made for some specialties. Gastroenterology, nephrology, neurosurgery, radiology, urology, and radiation therapy centers will take a 1% hit; ophthalmology, pathology, and vascular surgery will take 2% cuts; and interventional radiology will lose 7%.1 A special case within dermatology and pathology is a 15% cut to the technical component of slide preparation for CPT code 883054 due to a redefinition of the valuation of eosin stains.2 While the accuracy and precision of the value of these practice expense inputs can be debated, the government by definition makes the rules and involved specialties had an opportunity to appeal this change through the comment process that ended on September 6, 2016. The government can take comments into account, but substantial changes usually are not made from the Proposed Rule to the Final Rule, which usually arrives around the beginning of November; however, in an election year, the Final Rule can be a few weeks late.
The Ugly
The government will increase its unfunded mandates with the creation of new Medicare G codes (global services codes) that will allow the government to track the provision of postoperative care for all 010 and 090 global service periods (Table 1). The codes look mostly at time and do not clearly take into account the severity or complexity of the conditions being cared for and will be reported on claim forms as an unfunded mandate with more confusion and cost.1 Because not all claim-paying intermediaries are likely to have these G codes smoothly set up in their systems, there will still be a cost to filing the claim. Unless changes occur in the Final Rule, which is unlikely, there will be no payment for the time and effort of submitting these claims. The goal of the US Government is to hone in on postoperative services and parse them down so they can cut payments wherever possible beginning in 2019.1 Everyone wants to save money, from the consumer5 to the payer, and the ultimate payer is playing hardball. Additional validation efforts likely will lower physician fee-for-service payments further.
The US Government also is taking a shot at what they call “misvalued services” that have not had recent refinement within the RUC process.1 The work list for 2017 includes a number of 000 global period codes where additional evaluation and management services are reported using modifier -25, which implies a substantial, separately identifiable cognitive service performed by the same physician on the day of a procedure above and beyond other services provided or beyond the usual preservice and postservice care associated with the procedure that was performed. Although codes such as biopsies (11100 and 11101) and premalignant destructions (17000–17004) have an adjustment built in and dermatologists who provide services on the same day are actually penalized for the multiple built-in reductions that are already additive, the government is concerned that 19% of the 000 global services were billed more than 50% of the time with an evaluation and management code with modifier -25. Eighty-three codes met the criteria for which the government believes it may be overpaying1; the codes of interest to dermatology are shown in Table 2.1
The refinement of global periods will be an ongoing exercise through 2017, and beyond, with results likely to play an important role in the 2019 fee schedule. These global period reviews combined with some Stark law refinement relating the leasing of space at market rates while disallowing the landlord physician from receiving patient referrals from the tenant may also affect practitioner income.1,6 I never cease to be amazed that former Congressman Fortney Hillman “Pete” Stark (D), who has an antikickback scheme that keeps expanding, never went after the banking and brokerage industries. The founder of the $1.1 billion Security National Bank, a small bank in Walnut Creek, California,7 never focused on regulating banks. In his 40-year congressional career, he decided physicians make better targets. His efforts have not helped physicians but have helped lawyers, as he is quick to acknowledge.8
Final Thoughts
I end this column with an appeal to the dermatologists of America. Go to the American Academy of Dermatology Association Political Action Committee website (https://skinpac.org/), the home page for the only political action committee that represents the dermatology specialty, and consider making a donation. Emergency medicine physicians created the “Giving a Shift” campaign, which is a donation to their national political action committee of one shift’s earnings, and most of us could easily donate a half day’s income, as the only way to potentially change the increasingly onerous burdens on practitioners is through political action. As we say at RUC meetings, you can eat lunch or be lunch. The choice is yours.
Just as Charlie Brown looks forward to the coming of the Great Pumpkin each Halloween, those of us who dance in the minefields of payment policy await the publication of the Proposed Rule, more formally known as the “Revisions to Payment Policies under the Physician Fee Schedule and Other Revisions to Part B for CY 2017.”1,2 You could read the entire tome—a mere 316 pages (excluding the hundreds of pages of granular supplement data discussed in the last few columns)—or simply read what I have outlined as the good, the bad, and the ugly for the Proposed Rule for 2017.
The Good
In 2017, dermatology will increase its share of the pie by 1% to $3.505 billion of a total $89.467 billion expected to be expended for physician services.1 The effect on individual providers will vary by geographic location and practice mix. Half is from the 0.5% increase that has come to all physicians across the board as mandated by the Medicare Access and CHIP Reauthorization Act (MACRA).3
Current Procedural Terminology (CPT) codes for reflectance confocal microscopy (96931–96936) will have Centers for Medicare & Medicaid Services valuations beginning in 2017, and individuals performing this service should be able to report it and be paid for their efforts.1 The values are below the American Medical Association/Specialty Society Relative Value Scale Update Committee (RUC) recommendations.
The Bad
Payment rates for 2017 will be based on a conversion factor of 35.7751,1 a drop from the 2016 conversion factor of 35.8043. Cuts will be made for some specialties. Gastroenterology, nephrology, neurosurgery, radiology, urology, and radiation therapy centers will take a 1% hit; ophthalmology, pathology, and vascular surgery will take 2% cuts; and interventional radiology will lose 7%.1 A special case within dermatology and pathology is a 15% cut to the technical component of slide preparation for CPT code 883054 due to a redefinition of the valuation of eosin stains.2 While the accuracy and precision of the value of these practice expense inputs can be debated, the government by definition makes the rules and involved specialties had an opportunity to appeal this change through the comment process that ended on September 6, 2016. The government can take comments into account, but substantial changes usually are not made from the Proposed Rule to the Final Rule, which usually arrives around the beginning of November; however, in an election year, the Final Rule can be a few weeks late.
The Ugly
The government will increase its unfunded mandates with the creation of new Medicare G codes (global services codes) that will allow the government to track the provision of postoperative care for all 010 and 090 global service periods (Table 1). The codes look mostly at time and do not clearly take into account the severity or complexity of the conditions being cared for and will be reported on claim forms as an unfunded mandate with more confusion and cost.1 Because not all claim-paying intermediaries are likely to have these G codes smoothly set up in their systems, there will still be a cost to filing the claim. Unless changes occur in the Final Rule, which is unlikely, there will be no payment for the time and effort of submitting these claims. The goal of the US Government is to hone in on postoperative services and parse them down so they can cut payments wherever possible beginning in 2019.1 Everyone wants to save money, from the consumer5 to the payer, and the ultimate payer is playing hardball. Additional validation efforts likely will lower physician fee-for-service payments further.
The US Government also is taking a shot at what they call “misvalued services” that have not had recent refinement within the RUC process.1 The work list for 2017 includes a number of 000 global period codes where additional evaluation and management services are reported using modifier -25, which implies a substantial, separately identifiable cognitive service performed by the same physician on the day of a procedure above and beyond other services provided or beyond the usual preservice and postservice care associated with the procedure that was performed. Although codes such as biopsies (11100 and 11101) and premalignant destructions (17000–17004) have an adjustment built in and dermatologists who provide services on the same day are actually penalized for the multiple built-in reductions that are already additive, the government is concerned that 19% of the 000 global services were billed more than 50% of the time with an evaluation and management code with modifier -25. Eighty-three codes met the criteria for which the government believes it may be overpaying1; the codes of interest to dermatology are shown in Table 2.1
The refinement of global periods will be an ongoing exercise through 2017, and beyond, with results likely to play an important role in the 2019 fee schedule. These global period reviews combined with some Stark law refinement relating the leasing of space at market rates while disallowing the landlord physician from receiving patient referrals from the tenant may also affect practitioner income.1,6 I never cease to be amazed that former Congressman Fortney Hillman “Pete” Stark (D), who has an antikickback scheme that keeps expanding, never went after the banking and brokerage industries. The founder of the $1.1 billion Security National Bank, a small bank in Walnut Creek, California,7 never focused on regulating banks. In his 40-year congressional career, he decided physicians make better targets. His efforts have not helped physicians but have helped lawyers, as he is quick to acknowledge.8
Final Thoughts
I end this column with an appeal to the dermatologists of America. Go to the American Academy of Dermatology Association Political Action Committee website (https://skinpac.org/), the home page for the only political action committee that represents the dermatology specialty, and consider making a donation. Emergency medicine physicians created the “Giving a Shift” campaign, in which a physician donates one shift’s earnings to their national political action committee; most of us could easily donate a half day’s income, as the only way to potentially change the increasingly onerous burdens on practitioners is through political action. As we say at RUC meetings, you can eat lunch or be lunch. The choice is yours.
References
1. Medicare Program; Revisions to Payment Policies Under the Physician Fee Schedule and Other Revisions to Part B for CY 2017; Medicare Advantage Pricing Data Release; Medicare Advantage and Part D Medical Loss Ratio Data Release; Medicare Advantage Provider Network Requirements; Expansion of Medicare Diabetes Prevention Program Model. Fed Regist. 2016;81(136):46162-46476. To be codified at 42 CFR §405, 410, 411, et al. https://www.gpo.gov/fdsys/pkg/FR-2016-07-15/pdf/2016-16097.pdf. Accessed September 7, 2016.
2. Revisions to Payment Policies under the Physician Fee Schedule and Other Revisions to Part B for CY 2017. Centers for Medicare & Medicaid Services website. https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/PhysicianFeeSched/PFS-Federal-Regulation-Notices-Items/CMS-1654-P.html. Accessed September 7, 2016.
3. Text of the Medicare Access and CHIP Reauthorization Act of 2015. GovTrack website. https://www.govtrack.us/congress/bills/114/hr2/text. Accessed September 9, 2016.
4. Kaplan KJ. Proposed Medicare 2017 reimbursement schedule whacks biopsy payments; digital pathology payments up. Digital Pathology Blog website. http://tissuepathology.com/2016/07/20/proposed-medicare-2017-reimbursement-schedule-whacks-biopsy-payments-digital-pathology-payments-up/#ixzz4HEqBLgzu. Published July 20, 2016. Accessed September 7, 2016.
5. Abelson R. Cost, not choice, is top concern of health insurance customers. The New York Times. http://www.nytimes.com/2016/08/13/business/cost-not-choice-is-top-concern-of-health-insurance-customers.html?_r=0. Published August 12, 2016. Accessed September 7, 2016.
6. Stark Law website. http://starklaw.org/. Accessed September 7, 2016.
7. Pete Stark. Freedom From Religion Foundation website. https://ffrf.org/news/day/dayitems/item/14800-pete-stark. Accessed September 19, 2016.
8. Adamy J. Pete Stark: law regulating doctors mostly helped lawyers. The Wall Street Journal. October 22, 2014. http://blogs.wsj.com/washwire/2014/10/22/pete-stark-law-regulating-doctors-mostly-helped-lawyers/. Accessed September 19, 2016.
Practice Points
- The Proposed Rule outlines the probable payment levels for calendar year 2017.
- The rule also announces how the Medicare Access and CHIP Reauthorization Act (MACRA) may be implemented.
Prostate cancer recurrence rates low with SBRT
BOSTON – For men with newly diagnosed low- or intermediate-risk prostate cancer, stereotactic body radiotherapy (SBRT) in the right hands can be safe, with low radiation-associated toxicities and with cancer control rates that compare favorably with those produced with external-beam radiotherapy (EBRT), investigators in a multicenter study report.
At 5-year follow-up, there were no grade 4 toxicities, no treatment-related deaths, and just five grade 3 adverse events that occurred in 4 out of 309 patients treated with SBRT, said Robert Meier, MD, at the annual meeting of the American Society for Radiation Oncology.
Using a standard radiotherapy definition of recurrence – a rise of more than 2 ng/mL in prostate-specific antigen (PSA) above the posttreatment nadir – 97.1% of all patients were recurrence-free at 5 years.
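As a minimal sketch of that definition (the function name and the PSA series are invented for illustration), recurrence is flagged the first time a PSA value exceeds the running posttreatment nadir by more than 2 ng/mL:

```python
# Sketch of the nadir + 2 ng/mL biochemical recurrence definition cited above.
# The PSA series (ng/mL) is an invented example.
def first_recurrence_index(psa_series, margin=2.0):
    """Return the index of the first PSA value exceeding the running
    posttreatment nadir by more than `margin` ng/mL, or None."""
    nadir = float("inf")
    for i, psa in enumerate(psa_series):
        nadir = min(nadir, psa)
        if psa > nadir + margin:
            return i
    return None

print(first_recurrence_index([4.1, 1.2, 0.6, 0.5, 1.1, 2.8]))  # -> 5 (2.8 > 0.5 + 2.0)
```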
Among 172 patients with low-risk disease (T1b-T2, Gleason 6 or less and PSA 10 ng/mL or less), 97.3% were recurrence-free at 5 years, which compares favorably with the 93% seen in combined data from three large clinical trials of dose-escalated EBRT, Dr. Meier said.
For the 137 patients with intermediate-risk disease (T1b-T2b, Gleason 7 and PSA of 10 ng/mL or less, or Gleason 6 or lower with a PSA between 10 and 20 ng/mL), 97% were recurrence-free at 5 years, a result “that matches the best results in radiotherapy for intermediate-risk patients, and matches the best results for, for example, dose-escalated IMRT [intensity-modulated radiation therapy],” he commented.
“The data is very encouraging,” commented Colleen Lawton, MD, professor and vice chair of radiation oncology at the Medical College of Wisconsin in Milwaukee.
To determine the safety and efficacy of SBRT in men with newly diagnosed prostate cancer, Dr. Meier and coinvestigators at six centers in the United States designed a prospective study.
A total of 309 patients were enrolled, and all were treated with SBRT delivered in 5 fractions of 8 Gy each over 5 days with a robotic linear accelerator that tracks the prostate and corrects for motion in three spatial dimensions as well as yaw, pitch, and roll.
The treatment-delivery pattern is shaped to constrain doses to the bladder, rectum, testes, and penile bulb.
Using standard dosimetry calculations, the total radiation dose delivered to the prostate is equivalent to approximately 100 Gy, Dr. Meier said.
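One way to see where a figure on the order of 100 Gy comes from is the standard equivalent dose in 2-Gy fractions (EQD2) conversion from the linear-quadratic model. The calculation below is a sketch, and the α/β ratio is an assumption; the report does not state which value the investigators used.

```latex
\mathrm{EQD2} \;=\; D\,\frac{d + \alpha/\beta}{2\,\mathrm{Gy} + \alpha/\beta},
\qquad D = 5 \times 8\,\mathrm{Gy} = 40\,\mathrm{Gy},\quad d = 8\,\mathrm{Gy}.
```

With an assumed α/β of 2 Gy for prostate cancer, this gives EQD2 = 40 × (8 + 2)/(2 + 2) = 100 Gy; with α/β = 1.5 Gy, it is about 109 Gy, either way in line with the approximately 100 Gy quoted above.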
The safety analysis was powered to consider a greater than 10% rate of grade 3-5 urinary or bowel side effects as excessive. The efficacy analysis was designed to ask whether 5-year recurrence-free rates in low-risk patients could equal or be superior to a historical control rate of 93%.
As noted before, there were no grade 4 toxicities and no treatment-related deaths, and the rate of grade 3 side effects was 2.7%, with two events occurring in low-risk patients and three in intermediate-risk patients. The events, all genitourinary toxicities, occurred from 11 to 51 months after treatment. Grade 1 or 2 genitourinary toxicities at any time were seen in 53% and 35% of patients, respectively. Grade 1 or 2 gastrointestinal toxicities were seen in 59% and 10%, respectively.
Five patients developed urinary retention that required temporary catheter placement.
The ideal candidate for the therapy is the unfavorable intermediate-risk patient, Dr. Meier said in an interview.
“These are the patients who, if they are going to get external-beam radiation, have to combine it with androgen ablation, and that has its own toxicities. SBRT did very well even in the unfavorable intermediate-risk patients, so I think that group, and for that matter any intermediate-risk patient, is ideally suited,” he said.
The study was supported by Accuray. Dr. Meier disclosed research grants from the company. Dr. Lawton reported no relevant financial disclosures.
Key clinical point: Stereotactic body radiotherapy was safe and achieved high 5-year recurrence-free rates in men with newly diagnosed low- or intermediate-risk prostate cancer.
Major finding: The 5-year recurrence-free rate for low-risk patients was 97.3%, surpassing the 93% seen with historic controls.
Data source: Prospective study in 309 patients treated at six U.S. centers.
Disclosures: The study was supported by Accuray. Dr. Meier disclosed research grants from the company. Dr. Lawton reported no relevant financial disclosures.
‘Bionic pancreas’ employs glucagon and insulin to stabilize blood sugar
MUNICH – A “bionic pancreas” that delivers glucagon as well as insulin fared well against conventional insulin pump therapy, significantly reducing mean glucose levels and minimizing time spent in hypoglycemia.
The iLet bionic pancreas is being developed by Beta Bionics in conjunction with Eli Lilly and with support from the National Institutes of Health. Beta Bionics bills itself as a public benefit corporation – “a for-profit entity also dedicated to social responsibility and a public benefit mission,” according to an article published by Boston University.
Glucagon is the key that takes the bionic pancreas above and beyond what current insulin pump systems can do, said Dr. Russell, an endocrinologist at Massachusetts General Hospital, Boston.
“Even the pancreas, which has all the advantages of releasing insulin right into the portal vein and directly sensing glucose in the blood, uses the countering hormone, glucagon, to prevent hypoglycemia, particularly during exercise and in the late postprandial phase,” he said. “We are trying to mimic that capability. Glucagon gives us an additional tool to further reduce the risk of hypoglycemia.”
The algorithm is almost completely independent of user input – another key benefit, Dr. Russell said. It initializes with input about the patient’s weight and adapts its insulin delivery by essentially learning each user. There is no need to count carbohydrates, for example. The system “learns” over time its user’s typical meal patterns and can be programmed to deliver insulin accordingly.
“The user can enter, for example, ‘typical lunch,’ and the system will dispense some insulin before the meal and the rest later, in automatic delivery mode.”
The iLet has been studied in several settings and populations including, most recently, a successful crossover trial in 19 children at a diabetes camp (Lancet Diabetes Endocrinol. 2016; 4[3]:233-43).
The device was similarly successful in both teens and adults.
The study Dr. Russell reported at the 2016 annual meeting of the European Association for the Study of Diabetes (EASD) comprised 39 patients with type 1 diabetes who were using an insulin pump. The mean age of the patients was 33 years, and the mean disease duration was 17 years. The mean daily insulin dose was 0.6 U/kg, the mean baseline hemoglobin A1c was 7.7%, and the mean blood glucose was 176 mg/dL.
The crossover trial randomized patients to 11 days of treatment with their existing Medtronic insulin pump or the iLet system. The primary outcomes were the mean glucose level and time spent in hypoglycemic range (less than 60 mg/dL).
In the usual care arm, “we saw a wide range of glucose values, which became much tighter when patients were using the bionic pancreas,” Dr. Russell said.
The overall average glucose level was 9 mmol/L in the usual care arm vs. 7.8 mmol/L in the bionic pancreas arm. Time in hypoglycemia was significantly reduced (27 minutes/24 hours vs. 9 minutes/24 hours).
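Because this report mixes mmol/L and mg/dL, a quick conversion may help; the sketch below also shows, on an invented 5-minute continuous glucose monitor trace, how time below the 60 mg/dL hypoglycemia threshold is tallied. Only the two arm means are taken from the study; everything else is illustrative.

```python
# Glucose unit conversion (1 mmol/L ~ 18 mg/dL) plus a toy tally of time spent
# below the 60 mg/dL threshold. The CGM trace and sampling interval are invented.
MGDL_PER_MMOLL = 18.016

def mmol_to_mgdl(value_mmol_per_l):
    return value_mmol_per_l * MGDL_PER_MMOLL

print(round(mmol_to_mgdl(9.0)))   # usual care mean: ~162 mg/dL
print(round(mmol_to_mgdl(7.8)))   # bionic pancreas mean: ~141 mg/dL

readings_mgdl = [105, 98, 72, 58, 55, 63, 80]  # hypothetical 5-minute samples
minutes_per_sample = 5
minutes_hypo = sum(minutes_per_sample for g in readings_mgdl if g < 60)
print(minutes_hypo, "minutes below 60 mg/dL")  # -> 10 minutes below 60 mg/dL
```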
“The standard deviation of the mean glucose was also larger in usual care, which is consistent with the automatic adaptive function of the bionic pancreas. If the mean glucose gets too high, it treats more aggressively; and if glucose is too low, it becomes less aggressive.”
The average amount of insulin used per day was similar in the two arms (0.62 U/kg vs. 0.66 U/kg), and the bionic pancreas delivered an average of 0.5 mg/day of glucagon.
During the night, the bionic pancreas kept mean blood glucose lower (134 mg/dL vs. 165 mg/dL) and more stable, reducing time in hypoglycemia (1.4 minutes/night vs. 4.8 minutes/night).
There was one incident of severe hypoglycemia in the usual-care arm, and none in the bionic-pancreas arm. There was a statistically significant increase in nausea associated with the glucagon, Dr. Russell noted, but the impact was still quite small. On a 0-10 rating scale, nausea in the bionic pancreas group was rated a 0.5 compared to a 0.05 in the usual care arm.
There were no changes in blood pressure, weight, or any lab test.
The ongoing studies continue to show “that automated bihormonal control of glycemia in the home-use setting is feasible,” Dr. Russell said. “Micro-dose glucagon was well tolerated and the bihormonal pancreas reduced both mean glucose and hypoglycemia relative to the patients’ usual care devices.”
The iLet continues to undergo modifications that are making it more user friendly, he added.
Dr. Russell disclosed that he has a patent pending on “certain aspects” of the device, and that he is an unpaid scientific consultant for Beta Bionics. He also disclosed financial relationships with several pharmaceutical and device companies, including Eli Lilly and Medtronic.
Key clinical point: A bihormonal “bionic pancreas” delivering both insulin and glucagon lowered mean glucose and reduced time in hypoglycemia compared with patients’ usual insulin pump therapy.
Major finding: The overall average glucose level was 9 mmol/L in the usual care arm vs. 7.8 mmol/L in the bionic pancreas arm.
Data source: The randomized crossover trial comprised 39 patients with type 1 diabetes.
Disclosures: Dr. Russell disclosed that he has a patent pending on “certain aspects” of the device and that he is an unpaid scientific consultant for Beta Bionics. He disclosed financial relationships with several pharmaceutical and device companies, including Eli Lilly and Medtronic.
Direct-acting antivirals: One of several keys to HCV eradication by 2030
Elimination of the public health threat posed by the hepatitis C virus (HCV) might seem impossible to achieve by 2030, but researchers in Italy say it can be done.
Important elements of success will include the use of oral direct-acting antivirals (DAAs) and a global commitment to prevention.
Earlier this year, the World Health Organization announced plans to wipe out HCV worldwide by 2030, using the time between now and 2021 to reduce the number of annual new infections.
Success in meeting the WHO challenge will hinge largely on the dramatic scale-up of new oral DAAs, according to Simone Lanini, MD, an epidemiologist at the National Institute for Infectious Diseases, Lazzaro Spallanzani-IRCCS, in Rome, and his coauthors. They’ve written a detailed analysis of all available tools and impending obstacles in the global fight against the virus.
With clinical trials consistently demonstrating HCV cure rates in excess of 85%, these short-duration oral treatment courses, which are well tolerated and have no absolute contraindications, “offer hope,” especially in combination with best practices in primary prevention, wrote Dr. Lanini and his colleagues.
DAAs – combination therapy with NS5B nucleotide analogue inhibitors and NS5A inhibitors – are viable treatments across all hepatitis C virus genotypes and are indicated for patients regardless of their stage of liver disease or whether they have failed prior treatments.
Access to these therapies, however, remains at issue.
“We have effective treatments in the form of DAAs but, currently, these are neither affordable nor accessible in many low- and middle-income countries,” study coauthor and scientific director at the Institute, Giuseppe Ippolito, MD, said in a statement. “Global pressure will be required to encourage generic competition to reduce the cost of medicines and diagnostics. This could include direct price negotiations with the pharmaceutical companies responsible for DAA manufacture, differential pricing, [or] voluntary licenses.”
Avoiding the spread of infection will be another key to overcoming HCV, particularly in several African nations such as Nigeria and Egypt, and other lower- and middle-income countries like India, where prevention measures such as screening donated blood for viral contamination are sparse. Worldwide, there is a need for better implementation of protocols to avoid unsafe injections, according to the study authors.
There is also a need for global cooperation and sharing of best practices among nations of all income levels to reduce HCV transmission in high-risk populations such as intravenous drug users and prisoners. Because measures to prevent mother-to-infant transmission are essentially ineffective, Dr. Lanini and his colleagues said, perinatal prevention of HCV infection should be emphasized. Tattooing and other cosmetic procedures, including circumcision, are also of concern, the authors wrote, particularly in Western Africa.
Controlling an infectious disease is one thing, but eradicating it takes an entirely different level of commitment, according to Dr. Lanini and his colleagues. There must be an effective intervention that disrupts transmission, such as the DAAs and accurate screening and diagnosis. The infection also must occur only in humans. Additionally, there needs to be a widely held belief among leaders at all levels of government that stopping infection is a relevant public concern; prevention and intervention strategies must meet economic constraints; and epidemiologic support – including access to screening and treatment and tracking of infectious cases – must be in place across all regions, the authors wrote.
Provided these criteria are met, a road map for success largely already exists, according to Dr. Ippolito. “[We] can learn from the innovative HIV service delivery approaches that have already been used successfully in marginalized and vulnerable populations across the world,” he said in the statement.
On Twitter @whitneymcknight
New antimalarial drug proves promising in phase II trial
Recent research suggests the novel antimalarial agent KAF156 is effective, with no apparent safety concerns, in adults with uncomplicated Plasmodium vivax or P. falciparum malaria, according to a study published in the New England Journal of Medicine.
From March to August 2013, 21 adults with acute uncomplicated malaria (11 with P. vivax malaria and 10 with P. falciparum malaria) were enrolled in multiple-dose cohorts (400 mg of KAF156 given once daily for 3 days). A third cohort of 21 patients with P. falciparum malaria was treated with a single 800-mg dose of KAF156 to assess the cure rate at 28 days and the potential for a single-dose cure.
Among the 21 patients with P. falciparum malaria who received the single 800-mg dose and were followed for 28 days, 1 had reinfection and 7 had recrudescent infections (cure rate, 67%). Gametocytemia was detected in two of the patients with P. vivax malaria at baseline and cleared in both patients within 16 hours after receipt of KAF156. In the patients with P. falciparum malaria, one patient had gametocytemia from baseline to 54 hours after receiving the dose and one had intermittent gametocytemia from baseline until 72 hours after dose administration.
The investigators also reported two patients had posttreatment gametocytemia – one had a single positive reading at 24 hours, and the other had positive readings from 12 to 96 hours, at which time sampling finished. Most patients had at least one adverse event, although no grade 4 or serious adverse events were noted. Overall, there were more adverse events after the single 800-mg dose than after multiple 400-mg doses.
“New antimalarial drugs are needed as artemisinin resistance spreads in Southeast Asia and partner-drug resistance follows,” the researchers concluded. “Our study showed that KAF156, a new antimalarial drug, has activity against vivax and falciparum malaria, including artemisinin-resistant parasites.”
Read the full study in the New England Journal of Medicine (2016 Sep. 21. doi: 10.1056/NEJMoa1602250).