FDA, hospitals caution against laparoscopic power morcellation during hysterectomy and myomectomy

Janelle Yates, Senior Editor

The use of power morcellation to remove the uterus or uterine tumors during hysterectomy and myomectomy may be riskier than many have thought. That’s the conclusion reached by the US Food and Drug Administration (FDA) in a safety communication issued April 17, 2014. In its communication, the FDA “discouraged” use of power morcellation during hysterectomy and myomectomy. Shortly afterward, Brigham and Women’s and Massachusetts General hospitals in Boston banned power morcellation in all hysterectomy and myomectomy procedures. The hospitals may resume power morcellation at some future date using a containment system, pending guidance from the Institutional Review Board.

Robert L. Barbieri, MD, who is chair of obstetrics and gynecology at Brigham and Women’s Hospital, recently wrote about this concern for OBG Management in his capacity as editor in chief of the journal.

“When used to treat tumors presumed to be fibroids, open power morcellation [without a containment system] is associated with an increased risk of dispersing benign myoma tissue and occult malignant leiomyosarcoma tissue throughout the abdominal cavity,” he wrote.1 “Dispersion of benign myoma tissue may result in the growth of fibroids on the peritoneal surface, omentum, and bowel, causing abdominal and pelvic pain and necessitating reoperation. Dispersion of leiomyosarcoma tissue throughout the abdominal cavity may result in a Stage I cancer being upstaged to a Stage IV malignancy, requiring additional surgery and chemotherapy. In cases in which open power morcellation causes the upstaging of a leiomyosarcoma, the death rate is increased.”1

The two Boston hospitals are not the only institutions reconsidering the use of power morcellation. Temple University Hospital in Philadelphia banned use of the procedure without a containment system in late February 2014.

And in December 2013, the Society of Gynecologic Oncology issued a position statement on the issue, which said, “power morcellation or other techniques that cut up the uterus in the abdomen have the potential to disseminate an otherwise contained malignancy throughout the abdominal cavity. For this reason, the Society of Gynecologic Oncology (SGO) asserts that it is generally contraindicated in the presence of documented or highly suspected malignancy, and may be inadvisable in premalignant conditions or risk-reducing surgery.”2

For its part, at the time of this writing, the AAGL, previously known as the American Association of Gynecologic Laparoscopists, “is reviewing the scientific evidence and best practices reported by our members,” stated an article in its Association News. “We recognize that, in rare cases, the use of power morcellators can lead to the dissemination of an occult malignancy of endometrial or myometrial origin, and also to dissemination of benign morcellated tissues. We encourage our members to fully research and understand the risks of power morcellation and to learn more about when alternative methods of tissue extraction may be appropriate.”3

FDA STOPS SHORT OF A BAN
In laying out its concerns, the FDA stopped short of an outright ban on power morcellation. Instead, it stated that, “based on currently available information, the FDA discourages the use of laparoscopic power morcellation during hysterectomy or myomectomy for uterine fibroids.”4

It also noted that approximately 1 in 350 women “undergoing hysterectomy or myomectomy for the treatment of fibroids is found to have an unsuspected uterine sarcoma.”4

Among its recommendations for health-care providers:

  • avoid laparoscopic uterine power morcellation in women with suspected or known uterine cancer
  • carefully consider all available treatment options for women with symptomatic uterine fibroids
  • thoroughly discuss the benefits and risks of all treatments with patients.4

The FDA also noted that “some clinicians and medical institutions now advocate using a specimen ‘bag’ during morcellation in an attempt to contain the uterine tissue and minimize the risk of spread in the abdomen and pelvis.”4

ACOG HAS YET TO WEIGH IN
At the time of this writing, the most recent committee opinion on choosing a hysterectomy route from the American College of Obstetricians and Gynecologists (ACOG) to touch on the issue states that, “the decision to perform a hysterectomy via [minimally invasive surgery] (with or without morcellation) is based on a patient evaluation, including the patient’s history and general health, tests, and procedures, such as pre-surgery biopsies. The evaluation and diagnostic process also provides an opportunity to identify any cautions or contraindications, such as finding a gynecological cancer.”5

FILLING THE TECHNOLOGY GAP
Now that power morcellation appears to be receding as an option for minimally invasive gynecologic surgeons, what is the best approach?

In its position statement, the SGO recommends that, “Patients being considered for minimally invasive surgery performed by laparoscopic or robotic techniques who might require intracorporeal morcellation should be appropriately evaluated for the possibility of coexisting uterine or cervical malignancy. Other options to intracorporeal morcellation include removing the uterus through a mini-laparotomy or morcellating the uterus inside a laparoscopic bag.”2

K. Anthony Shibley, MD, a Minneapolis-area ObGyn, has developed a novel strategy to prevent tissue dissemination during open power morcellation, demonstrated in a video at obgmanagement.com. Similarly, Ceana Nezhat, MD, and Erica Dun, MD, demonstrate enclosed vaginal morcellation of a large uterus. These and other features are available in the Morcellation Topic Collection at obgmanagement.com.

WE WANT TO HEAR FROM YOU!
Drop us a line and let us know what you think about this or other current articles, which topics you'd like to see covered in future issues, and what challenges you face in daily practice. Email us at [email protected], and please include your name and the city and state in which you practice.

References

  1. Barbieri RL. Options for reducing the use of open power morcellation of uterine tumors. OBG Manag. 2014;26(3):10,11,20.
  2. Society of Gynecologic Oncology. Position Statement: Morcellation. https://www.sgo.org/newsroom/position-statements-2/morcellation/. Published December 2013. Accessed April 8, 2014.
  3. AAGL Member Update: Disseminated Leiomyosarcoma with Power Morcellation. http://www.aagl.org/aaglnews/aagl-member-update-disseminated-leiomyosarcoma-with-power-morcellation/. Accessed April 11, 2014.
  4. US Food and Drug Administration. Laparoscopic uterine power morcellation in hysterectomy and myomectomy: FDA safety communication. April 17, 2014.
  5. American College of Obstetricians and Gynecologists. Committee Opinion No. 444: Choosing the route of hysterectomy for benign disease. Obstet Gynecol. 2009;114(5):1156–1158.

Autism May Start in Utero

Autism may begin in utero, according to a study of postmortem brain tissue from children with and without autism published online ahead of print March 27 in the New England Journal of Medicine.

The findings imply that layer formation and layer-specific neuronal differentiation are dysregulated during prenatal development. The study also suggests that early recognition and treatment of autism may allow the developing brains of autistic children to construct alternative brain pathways around the patchy defects in the cortex. The result could be improved social functioning and communication, the researchers theorized.

Researchers used gene expression to examine cellular markers in each of the cortical layers, as well as genes associated with autism. Markers for several layers of the cortex were absent in the brain tissue of 10 of 11 (91%) children with autism and in one of 11 (9%) control children. The areas of disorganization spanned multiple cortical layers, with the most abnormal expression noted in layers 4 and 5, and the focal disruption of cortical laminar architecture appeared as patches 5 to 7 mm long.

Mary Jo M. Dales

Suggested Reading
Stoner R, Chow ML, Boyle MP, et al. Patches of disorganization in the neocortex of children with autism. N Engl J Med. 2014;370(13):1209-1219.


Prevalence of Autism Spectrum Disorder Is Increasing

The CDC estimates that about one in 68 US children has autism spectrum disorder, according to findings published in the March 28 issue of Morbidity and Mortality Weekly Report Surveillance Summaries. This prevalence is a 30% increase from the CDC’s estimate of one in 88 children using 2008 data.

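A quick arithmetic check (an illustrative sketch, not part of the CDC report) confirms that moving from 1 in 88 to 1 in 68 corresponds to roughly a 30% relative increase in prevalence:

```python
# Illustrative check of the reported prevalence increase (not from the CDC report).
old_prevalence = 1 / 88   # CDC estimate based on 2008 data
new_prevalence = 1 / 68   # CDC estimate based on 2010 data

relative_increase = new_prevalence / old_prevalence - 1
print(f"{relative_increase:.1%}")  # 29.4% -- consistent with the reported ~30% increase
```
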
The findings also show that autism spectrum disorder continues to be more prevalent in boys than in girls: one in 42 boys had autism spectrum disorder in the latest report, compared with one in 189 girls.

The increased prevalence could be attributed to improved clinician identification of autism, a growing number of autistic children with average to above-average intellectual ability, or a combination of both factors, said Coleen Boyle, PhD, Director of the CDC’s National Center on Birth Defects and Developmental Disabilities (NCBDDD).

The CDC analyzed 2010 data collected by its Autism and Developmental Disabilities Monitoring (ADDM) Network, which provides population-based estimates of autism spectrum disorder prevalence among children age 8 at 11 sites in the United States, based on records from community sources that diagnose and provide services to children with developmental disabilities.

Of the 11 sites studied, seven had information available on the intellectual ability of at least 70% of children with autism spectrum disorder. Of the 3,604 children for whom data were available, 31% were classified as having intellectual disability (IQ of 70 or lower), 23% were considered borderline (IQ of 71 to 85), and 46% had IQ scores greater than 85, considered average or above-average intellectual ability.

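Applied to the 3,604 children with available data, those percentages imply the approximate group sizes below (a back-of-the-envelope sketch; the summary itself reports only percentages):

```python
# Approximate head counts implied by the reported percentages (illustrative only;
# the MMWR summary reports percentages, and these rounded counts are inferred).
n = 3604
groups = [
    ("intellectual disability (IQ <= 70)", 0.31),
    ("borderline (IQ 71-85)", 0.23),
    ("average or above average (IQ > 85)", 0.46),
]
for label, share in groups:
    print(f"{label}: ~{round(n * share)} children")
# ~1117, ~829, and ~1658 children, respectively; the three shares sum to 100%.
```
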
“We recognize now that autism is a spectrum, no longer limited to the severely affected,” said Marshalyn Yeargin-Allsopp, MD, Chief of the Developmental Disabilities branch of NCBDDD. “There are children with higher IQs being diagnosed who may not even be receiving special education services, and the numbers may reflect that.”

Non-Hispanic white children were 30% more likely to be diagnosed with autism spectrum disorder than were non-Hispanic black children and about 50% more likely to be diagnosed with autism spectrum disorder than were Hispanic children.

Dr. Boyle stressed the importance of early screening and identification of autism spectrum disorder in children (it can be diagnosed by the time a child reaches age 2) and urged parents to take action if a child shows any signs of developmental delays.

“Community leaders, health professionals, educators, and childcare providers should use these data to ensure that children with autism spectrum disorder are identified as early as possible and connected to the services they need,” said Dr. Boyle.

To help promote early intervention in autism spectrum disorder, the CDC will be launching an awareness initiative called “Birth to Five, Watch Me Thrive,” which aims to provide parents, teachers, and community members with information and resources about developmental milestones and screening for autism.

“Most children with autism are not diagnosed until after age 4,” said Dr. Boyle. “The CDC will continue to promote early identification and research. The earlier a child is identified and connected with services, the better.”

The CDC cited several limitations to the report. First, the surveillance sites were not selected to be representative of the entire United States. Second, population denominators used for this report were based on the 2010 decennial census. Comparisons with previous ADDM findings thus should be interpreted with caution because ADDM reports from nondecennial surveillance years are likely influenced by greater error in the population denominators used for those previous surveillance years, which were based on postcensus estimates. Third, three of the nine sites with access to review children’s education records did not receive permission to do so in all school districts within the site’s overall surveillance area. Fourth, findings that address intellectual ability might not be generalizable to all ADDM sites. Finally, race and ethnicity are presented in broad terms and should not be interpreted as generalizable to all persons within those categories.

Madhu Rajaraman

Suggested Reading
Developmental Disabilities Monitoring Network Surveillance Year 2010 Principal Investigators. Prevalence of autism spectrum disorder among children aged 8 years—autism and developmental disabilities monitoring network, 11 sites, United States, 2010. MMWR Surveill Summ. 2014; Mar 28;63 Suppl 2:1-21.


Link Between PTSD and TBI Is Only the Beginning for MRS Study

By Reid A. Paul

April 25, 2014

A fundamental challenge for any study examining the impact of military service on the health of military personnel is establishing a baseline. Whether the condition is heart disease or posttraumatic stress disorder (PTSD), symptoms often appear after (sometimes long after) the service has ended. The longitudinal Marine Resiliency Study (MRS-I) and its successor, MRS-II, are seeking to resolve that issue with a novel approach that brings together the Department of Veterans Affairs, U.S. Marine Corps, and Navy Medicine.

In the MRS study, a cohort of about 2,600 Marines (MRS-I) in 4 battalions and about 1,300 Marines (MRS-II) in 2 battalions deployed to Iraq or Afghanistan underwent a scientifically rigorous examination a month prior to deployment. This baseline was established using self-reported questionnaires, clinical interviews, and laboratory examinations. Follow-up examinations were repeated at 3 months (MRS-I and MRS-II) and again at 6 months post-deployment (MRS-I).

The program is ambitious, Dr. Dewleen Baker of the VA San Diego Health Care System told Federal Practitioner. “MRS was designed to provide broad-based (psychosocial, psychophysiological, and biological) prospective, longitudinal data, with a goal toward ultimate integrated analyses of variables, to determine risk and resilience for post-deployment mental health outcomes, i.e., PTSD and co-occurring disorders,” she explained. “Analyses have just begun, and we are working our way through aspects of the data toward more integrated approaches.”

In one of the first of many reports to come out of MRS, the researchers found that the probability of developing PTSD was highest for participants with severe pre-deployment symptoms, high combat intensity, and deployment-related traumatic brain injury (TBI). Most significant, the researchers found that TBI doubled or nearly doubled the PTSD rates for participants with less severe pre-deployment PTSD symptoms. According to Baker:

Based on evidence from previous studies, we anticipated that prior psychiatric symptoms and combat intensity would be important predictors of post-deployment PTSD symptom severity. In addition, previous work has suggested a strong association between symptoms of TBI and PTSD but without a causal link. These 3 factors were all significant predictors of post-deployment symptoms in our study. An individual with no preexisting PTSD symptoms and low combat intensity is at minimal risk for developing PTSD (less than 1% probability). Unit increases in preexisting symptom scores and combat intensity modestly increase post-deployment symptom scores by 1% to 2%.

By contrast, deployment-related mild TBI increases post-deployment symptom scores by 23%, and moderate-to-severe injuries increase scores by 71%. Our findings suggest that TBI may be a very important risk factor of PTSD, even when accounting for preexisting symptoms and combat intensity.

Our study focused on the impact of pre-deployment symptoms, combat intensity and TBI; however, it is important to consider other factors of psychological risk and resilience. Genes, coping style, and social support are just a few of the many other factors that may influence an individual’s response to stress.

Creating a rigorous cross-agency research study required tact, diligence, and patience from the MRS team. “Each agency has their own unique culture and institutional rules, regulations, and bureaucracy, so ideas, programs, etc, must be vetted across all agencies and reconciled; the various cultures/agencies to be reconciled include DoD, VA, and academia,” Baker explained. “In addition, in regards to initiation of studies for MRS-II, for the past couple years, we also interface with NIMH as well as Headquarters Marine Corps; NIMH has the role of scientific review of MRS-II studies carried out under Headquarters Marine Corps/BUMED funding.”

The MRS-I and MRS-II studies may very well provide a template for future studies. The MRS team included a military liaison to work with the active-duty Marines and attached Sailors, gather data, schedule meetings, and report findings. “This study has a lot of experience working within and across these agencies,” Baker noted. “It’s an excellent model for future VA/DoD joint projects.”


Drug confers benefits for subset of AML patients

[Image: Patient receives chemotherapy. Credit: Rhoda Baer]

A drug that combines 2 chemotherapy agents in a single formulation can be more effective than conventional treatment with the individual agents given in combination, results of a phase 2 study suggest.

The drug, CPX-351, is a fixed-ratio combination of cytarabine and daunorubicin inside a lipid vesicle.

In older patients with acute myeloid leukemia (AML), CPX-351 elicited a higher response rate than combination treatment with cytarabine and daunorubicin, although the difference was not significant.

Likewise, there were no significant differences in event-free survival (EFS) or overall survival (OS) between the 2 treatment groups.

However, CPX-351 conferred a significant response benefit among patients with adverse cytogenetics and a significant survival benefit in patients with secondary AML (sAML).

Jeffrey Lancet, MD, of the Moffitt Cancer Center in Tampa, Florida, and his colleagues reported these results in Blood. The study was funded by Celator Pharmaceuticals, the company developing CPX-351.

Treatment details

The researchers analyzed 126 newly diagnosed AML patients who were 60 to 75 years of age.

Patients were randomized to receive CPX-351 (n=85) or “control” treatment consisting of cytarabine and daunorubicin (n=41). The 2 treatment groups were well-balanced for disease and patient characteristics at baseline.

As induction, patients in the CPX-351 arm received a 90-minute infusion of the drug at 100 units/m2 on days 1, 3, and 5 (delivering 100 mg/m2 cytarabine and 44 mg/m2 daunorubicin with each dose). Second induction and consolidation courses were given at 100 units/m2 on days 1 and 3.
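
Because each “unit” of CPX-351 corresponds to fixed drug amounts, the absolute doses delivered scale with body surface area (BSA). A minimal sketch of that arithmetic for the 3-dose induction course described above; the BSA value is an arbitrary example, not a patient from the study.

```python
# Per-course drug delivery for CPX-351 induction, using the figures
# above: each 100 units/m2 dose delivers 100 mg/m2 cytarabine and
# 44 mg/m2 daunorubicin, given on days 1, 3, and 5. The BSA value is
# an arbitrary example.

CYTARABINE_MG_PER_M2 = 100
DAUNORUBICIN_MG_PER_M2 = 44
INDUCTION_DOSES = 3  # days 1, 3, and 5

def induction_totals(bsa_m2: float) -> dict:
    """Total drug delivered over one CPX-351 induction course."""
    return {
        "cytarabine_mg": CYTARABINE_MG_PER_M2 * bsa_m2 * INDUCTION_DOSES,
        "daunorubicin_mg": DAUNORUBICIN_MG_PER_M2 * bsa_m2 * INDUCTION_DOSES,
    }

print(induction_totals(bsa_m2=1.8))
# {'cytarabine_mg': 540.0, 'daunorubicin_mg': 237.6}
```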

Patients in the control arm received induction therapy consisting of cytarabine at 100 mg/m2/day by 7-day continuous infusion and daunorubicin at 60 mg/m2/day on days 1, 2, and 3. Daunorubicin could be reduced to 45 mg/m2/day at the investigator’s discretion for patients with advanced age, poor performance status, or reduced liver/kidney function.

The choice of consolidation therapy was at the investigator’s discretion as well. The recommended regimens included cytarabine at 100 to 200 mg/m2 for 5 to 7 days, with or without daunorubicin or intermediate-dose cytarabine (1.0 to 1.5 g/m2/dose).

Response and survival

The response rate was higher in the CPX-351 arm than in the control arm—66.7% and 51.2%, respectively (P=0.07), which met the predefined criterion for success (P<0.1). Response was defined as a complete response (CR) or a complete response with incomplete blood count recovery (CRi).
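
As a rough cross-check, the comparison can be reconstructed from the reported percentages and arm sizes. The counts below are approximations derived from those figures, and the trial's prespecified analysis may have used a different test, so this simple two-proportion z-test only lands in the neighborhood of the published value.

```python
# Rough cross-check of the 66.7% vs 51.2% response comparison, using
# counts reconstructed from the reported percentages and arm sizes
# (85 and 41 patients). The trial's prespecified analysis may have
# differed, so this lands only near the published P=0.07.
from statsmodels.stats.proportion import proportions_ztest

responders = [round(0.667 * 85), round(0.512 * 41)]  # ~57 and ~21
arm_sizes = [85, 41]

z_stat, p_value = proportions_ztest(responders, arm_sizes)
print(f"z = {z_stat:.2f}, two-sided P = {p_value:.3f}")  # P ≈ 0.09 here
```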

CRs occurred in 48.8% of patients in both arms, but the CRi rate favored the CPX-351 arm over the control arm—17.9% and 2.4%, respectively.

Response rates also favored CPX-351 among patients with adverse cytogenetics and those with sAML.

Among patients with adverse cytogenetics, the response rate was 77.3% in the CPX-351 arm and 38.5% in the control arm (P=0.03). And among patients with sAML, the response rate was 57.6% in the CPX-351 arm and 31.6% in the control arm (P=0.06).

The median OS was 14.7 months in the CPX-351 arm and 12.9 months in the control arm. The median EFS was 6.5 months and 2.0 months, respectively. These differences were not statistically significant.

However, sAML patients treated with CPX-351 had significantly better OS than sAML patients in the control arm. The median OS was 12.1 months and 6.1 months, respectively (P=0.01). And the median EFS was 4.5 months and 1.3 months, respectively (P=0.08).

Safety results

By day 60, 4.7% of patients in the CPX-351 arm and 14.6% of patients in the control arm had died. All of these deaths occurred in high-risk patients, particularly those with sAML.

Two patients died of intracranial hemorrhage during CPX-351 consolidation. One of these deaths was associated with head trauma and relapsed AML, and the other with chemotherapy-induced thrombocytopenia.

For many of the most common adverse events, there were minimal differences between the treatment arms. These events included febrile neutropenia, infection, rash, diarrhea, nausea, edema, and constipation.

Patients in the CPX-351 arm had a higher incidence of grade 3-4 infection than controls—70.6% and 43.9%, respectively—but a lower rate of infection-related deaths—3.5% and 7.3%, respectively.

The median time to neutrophil recovery (to ≥ 1000/μL) was longer in the CPX-351 arm than the control arm—36 days and 32 days, respectively. The same was true for platelet recovery (to ≥ 100,000/μL)—37 days and 28 days, respectively.

Researchers are now conducting a phase 3 trial of CPX-351, which is open and recruiting patients.


Team reprograms blood cells into HSCs in mice

Article Type
Changed
Fri, 04/25/2014 - 05:00
Display Headline
Team reprograms blood cells into HSCs in mice

Lab mouse

Researchers have found a way to reprogram mature blood cells from mice into hematopoietic stem cells (HSCs), according to a paper published in Cell.

The team used 8 transcription factors to reprogram blood progenitor cells and mature mouse myeloid cells into HSCs.

These cells, called induced HSCs (iHSCs), have the functional hallmarks of natural HSCs: they are able to self-renew and can give rise to all of the cellular components of the blood.

“Blood cell production invariably goes in one direction—from stem cells, to progenitors, to mature effector cells,” said study author Derrick J. Rossi, PhD, of Boston Children’s Hospital in Massachusetts.

“We wanted to reverse the process and derive HSCs from differentiated blood cells using transcription factors that we found were specific to HSCs.”

To that end, Dr Rossi and his colleagues screened gene expression in 40 different types of blood and blood progenitor cells from mice. From this screen, the team identified 36 transcription factors that are expressed in HSCs but not in the cells that arise from them.
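
Conceptually, this screen is a differential-detection filter: keep the factors seen in HSCs but in none of the downstream populations. The toy sketch below illustrates only that filtering logic; the expression matrix, the detection threshold, and the assumption that one column holds the HSC profile are all invented for the example.

```python
import numpy as np

# Toy version of the screening logic: keep transcription factors
# detected in HSCs but in none of the downstream cell types. The
# expression matrix, detection threshold, and the assumption that
# column 0 holds the HSC profile are all invented for illustration.
rng = np.random.default_rng(0)
expression = rng.random((200, 40))    # 200 factors x 40 cell types

detected = expression > 0.95          # arbitrary detection threshold
in_hsc = detected[:, 0]               # column 0 = HSC profile (assumed)
in_progeny = detected[:, 1:].any(axis=1)

hsc_specific = in_hsc & ~in_progeny
print(f"{hsc_specific.sum()} candidate HSC-specific factors")
```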

In a series of mouse transplantation experiments, the researchers found that 6 of the 36 transcription factors—Hlf, Runx1t1, Pbx1, Lmo2, Zfp37, and Prdm5—plus 2 additional factors not originally identified in their screen—Mycn and Meis1—were sufficient to reprogram 2 kinds of blood progenitor cells—pro/pre-B cells and common myeloid progenitor cells—into iHSCs.

The team reprogrammed their source cells by exposing them to viruses containing the genes for all 8 transcription factors and a molecular switch that turned the factor genes on in the presence of doxycycline. They then transplanted the exposed cells into recipient mice and activated the genes by giving the mice doxycycline.

The resulting iHSCs were capable of generating the entire blood cell repertoire in the transplanted mice, showing they had gained the ability to differentiate into all blood lineages. Stem cells collected from those recipients were capable of reconstituting the blood of secondary transplant recipients, proving that the 8-factor cocktail could instill the capacity for self-renewal.

Taking the work a step further, the researchers treated mature mouse myeloid cells with the same 8-factor cocktail. The resulting iHSCs produced all of the blood lineages and could regenerate the blood of secondary transplant recipients.

Study author Stuart Orkin, MD, of the Dana-Farber Cancer Institute in Boston, noted that the use of mice as a kind of reactor for reprogramming marks a novel direction in HSC research.

“In the blood research field, no one has the conditions to expand HSCs in the tissue culture dish,” he said. “Instead, by letting the reprogramming occur in mice, Rossi takes advantage of the signaling and environmental cues HSCs would normally experience.”

Dr Orkin added that iHSCs are nearly indistinguishable from normal HSCs at the transcriptional level. Unfortunately, though, these findings are far from translation to the clinic.

Researchers must still ascertain the precise contribution each of the 8 transcription factors makes in the reprogramming process and determine whether approaches that do not rely on viruses and transcription factors can have similar success.

In addition, studies are needed to test whether these results can be achieved using human cells and whether other, non-blood cells can be reprogrammed to iHSCs.


A new method for measuring DNA repair

Article Type
Changed
Fri, 04/25/2014 - 05:00
Display Headline
A new method for measuring DNA repair

DNA repair in action

Credit: NIGMS

Cells have several major repair systems that can fix DNA damage; damage that goes unrepaired may lead to cancer and other diseases.

Unfortunately, the effectiveness of these repair systems varies greatly from person to person.

Now, researchers have developed a test that can rapidly assess several of these repair systems, which could potentially help us determine an individual’s risk of developing cancer and predict how a patient might respond to chemotherapy.

The new test, described in Proceedings of the National Academy of Sciences, can analyze 4 types of DNA repair capacity simultaneously, in less than 24 hours. Previous tests have only been able to evaluate a single system at a time.

“All of the repair pathways work differently, and the existing technology to measure each of those pathways is very different for each one,” said study author Zachary Nagel, PhD, of the Massachusetts Institute of Technology in Cambridge.

“What we wanted to do was come up with one way of measuring all DNA repair pathways at the same time so you have a single readout that’s easy to measure.”

The researchers used this approach to measure DNA repair in lymphoblastoid cells taken from 24 healthy subjects. The team found a huge range of variability, especially in one repair system, where some subjects’ cells were more than 10 times more efficient than others.

“None of the cells came out looking the same,” said study author Leona Samson, PhD, also of MIT. “They each have their own spectrum of what they can repair well and what they don’t repair well. It’s like a fingerprint for each person.”

Measuring repair

With the new test, the team can measure how well cells repair the most common DNA lesions, including single-strand breaks, double-strand breaks, mismatches, and the introduction of alkyl groups caused by pollutants such as fuel exhaust and tobacco smoke.

To achieve this, the researchers created 5 different circular pieces of DNA, 4 of which carry DNA lesions. Each of these circular DNA strands, or plasmids, also carries a gene for a different colored fluorescent protein.

In some cases, the DNA lesions prevent those genes from being expressed, so when the DNA is successfully repaired, the cell begins to produce the fluorescent protein. In others, repairing the DNA lesion turns the fluorescent gene off.

By introducing these plasmids into cells and reading the fluorescent output, scientists can determine how efficiently each kind of lesion has been repaired. In theory, more than 5 plasmids could go into each cell, but the researchers limited each experiment to 5 reporter plasmids to avoid potential overlap among colors.
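
For reporters of the first kind, where repair switches the fluorescent gene on, a natural readout is each damaged reporter's signal expressed relative to the undamaged control. The sketch below shows that normalization; all readout values are invented examples, and reporters where repair turns the gene off would need the interpretation inverted.

```python
# Normalization sketch for "repair turns the gene on" reporters:
# express each damaged reporter's signal relative to the undamaged
# control. All readout values are invented examples, not study data.

# % of cells expressing each fluorescent reporter
readouts = {
    "undamaged_control": 42.0,
    "alkylation_lesion": 18.5,
    "mismatch_lesion": 30.1,
    "single_strand_break": 38.7,
    "double_strand_break": 9.4,
}

control = readouts["undamaged_control"]
for lesion, signal in readouts.items():
    if lesion == "undamaged_control":
        continue
    capacity = 100.0 * signal / control  # repair capacity, % of control
    print(f"{lesion}: {capacity:.1f}% of control")
```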

To overcome that limitation, the researchers are also developing an alternative tactic that involves sequencing the messenger RNA produced by cells when they copy the plasmid genes, instead of measuring fluorescence.

In this study, the team tested the sequencing approach with just one type of DNA repair, but it could allow for unlimited tests at one time. And the researchers could customize the target DNA sequence to reveal information about which type of lesion the plasmid carries, as well as information about which patient’s cells are being tested.

This would allow many different patient samples to be tested in the same batch, making the test more cost-effective.

Making predictions

Previous studies have shown that many different types of DNA repair capacity can vary greatly among apparently healthy individuals. Some of these differences have been linked with cancer vulnerability.

Scientists have also identified links between DNA repair and neurological, developmental, and immunological disorders. But useful predictive DNA-repair-based tests have not been developed, largely because it has been impossible to rapidly analyze several different types of DNA repair capacity at once.

Dr Samson’s lab is now working on adapting the new test so it can be used with blood samples taken from patients, allowing researchers to identify patients who are at higher risk of disease and potentially enabling prevention or earlier diagnosis of diseases linked to DNA repair.

Such a test could also be used to predict a patient’s response to chemotherapy or to determine how much radiation treatment a patient can tolerate.

The researchers also believe this test could be exploited to screen for new drugs that inhibit or enhance DNA repair. Inhibitors could be targeted to tumors to make them more susceptible to chemotherapy, while enhancers could help protect people who have been accidentally exposed to DNA-damaging agents, such as radiation.


Group maps B-cell development

Article Type
Changed
Fri, 04/25/2014 - 05:00
Display Headline
Group maps B-cell development

B-cell lymphoma

New technology has allowed scientists to create the most comprehensive map of B-cell development to date, according to a paper published in Cell.

The team combined emerging technologies for studying single cells with an advanced computational algorithm to map human B-cell development.

They believe their approach could improve researchers’ ability to investigate development in all cells and make it possible to identify rare aberrations that lead to disease.

“There are so many diseases that result from malfunctions in the molecular programs that control the development of our cell repertoire and so many rare, yet important, regulatory cell types that we have yet to discover,” said study author Dana Pe’er, PhD, of Columbia University in New York.

“We can only truly understand what goes wrong in these diseases if we have a complete map of the progression in normal development.”

Combining technologies

Dr Pe’er and her colleagues used mass cytometry to observe cells in a bone marrow sample. In a single experiment, mass cytometry can measure 44 molecular markers simultaneously in millions of individual cells. This provides data that can be used to compare, categorize, and order cells, as well as identify the molecular systems responsible for development.

Taking advantage of these data required the researchers to develop new mathematical and computational methods for interpreting them. Just as one can represent a physical object in 3 dimensions, the Pe’er lab’s approach involved thinking of the 44 measurements as a 44-dimensional geometric object.

So they created a new computational algorithm called Wanderlust, which uses mathematical concepts from a field called graph theory to reduce this high-dimensional data into a simple form that is easier to interpret. Wanderlust converts the developmental marker measurements in each cell into a single, 1-dimensional value that corresponds to the cell’s place within the chronology of development.
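
Wanderlust itself builds ensembles of k-nearest-neighbor graphs and refines the ordering with randomly selected waypoint cells. The sketch below captures only the core idea of graph-based ordering, ranking synthetic cells by shortest-path distance from a chosen starting cell; it is not the published algorithm, and the data are simulated.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

# Core idea of graph-based developmental ordering, on synthetic data:
# build a k-nearest-neighbor graph over the 44-marker measurements and
# rank each cell by shortest-path distance from a chosen "earliest"
# cell. Wanderlust itself is more elaborate (graph ensembles plus
# waypoints); this is only an illustrative sketch.
rng = np.random.default_rng(1)
t = np.sort(rng.random(500))                  # hidden developmental time
cells = np.outer(t, rng.normal(size=44))      # 500 cells x 44 markers
cells += 0.05 * rng.normal(size=cells.shape)  # measurement noise

knn = kneighbors_graph(cells, n_neighbors=10, mode="distance")
dist = shortest_path(knn, directed=False, indices=0)  # cell 0 = start

order = np.argsort(dist)  # cells ranked from earliest to latest
print("correlation of graph distance with true time:",
      round(float(np.corrcoef(dist, t)[0, 1]), 3))
```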

“Our body has trillions of cells of countless different types, each type bearing different molecular features and behavior,” Dr Pe’er noted. “This complexity expands from a single cell in a carefully regulated process called development.”

“This regulation creates patterns and shapes in the high-dimensional data we measure. By using Wanderlust to analyze these data, we can find the pattern and trace the trajectory that cellular development follows.”

Mapping B-cell development

To test their approach, the researchers studied development in human B cells. The team used mass cytometry to profile 44 markers in a cohort of approximately 200,000 healthy immune cells that were gathered from a single bone marrow sample.

In each cell, they measured surface markers that help identify cell type, as well as markers inside the cell that can reveal what the cell is doing, including markers for signaling, the cell cycle, apoptosis, and genome rearrangement.

Using Wanderlust to analyze the high-dimensional data provided by mass cytometry, the researchers accurately ordered the entire trajectory of 200,000 cells according to their developmental chronology. Wanderlust captured and correctly ordered all of the primary molecular landmarks known to be present in human B-cell development.

The algorithm also pinpointed a number of previously unknown regulatory signaling checkpoints that are required for human B-cell development, as well as uncharacterized subtypes of B-cell progenitors that correspond to developmental stages.

The researchers identified rare, previously unknown signaling events involving STAT5 that occurred in just 7 out of 10,000 cells. The team found that disrupting these signaling events using kinase inhibitors fully stalled the development of B cells.

Identifying and characterizing the regulatory checkpoints that control and monitor cell fate can have many practical applications, the researchers said, including the development of new diagnostics and therapeutics.

Furthermore, the team’s mapping process can be applied to any type of cell. They believe their method offers the possibility of studying normal development as well as the processes responsible for any kind of developmental disease.

“This current project is a landmark, both in the study of development and in single-cell research, and has completely changed the way I think about science,” Dr Pe’er said. “A fire has been lit, and these findings are just the tip of the iceberg of what is now possible.”


Better Medication Adherence with Intervention; Clinical Outcomes Unchanged

Article Type
Changed
Fri, 09/14/2018 - 12:14
Display Headline
Better Medication Adherence with Intervention; Clinical Outcomes Unchanged

Clinical question

Does an intervention consisting of increased pharmacist involvement and education increase long-term medication adherence in patients after hospitalization for acute coronary syndrome?

Bottom line

Following hospitalization for acute coronary syndrome (ACS), an intervention that emphasizes medication reconciliation, pharmacist-led education, collaboration between pharmacists and physicians, and automated reminders increases patients’ adherence to cardiac medications. However, the intervention had no significant effect on clinical outcomes at 1 year. (LOE = 1b)

Reference

Ho PM, Lambert-Kerzner A, Carey EP, et al. Multifaceted intervention to improve medication adherence and secondary prevention measures after acute coronary syndrome hospital discharge: A randomized clinical trial. JAMA Intern Med 2014;174(2):186-193.

Study design

Randomized controlled trial (nonblinded)

Funding source

Government

Allocation

Concealed

Setting

Inpatient (any location) with outpatient follow-up

Synopsis

In this study performed at 4 Veterans Affairs (VA) medical centers, investigators enrolled 253 patients who were hospitalized with a primary diagnosis of ACS, had an anticipated discharge to home, and used the VA as their primary source of medical and pharmaceutical care. Using concealed allocation, patients were randomized to receive usual care or the intervention. Both groups received standard discharge instructions, discharge medication lists, and education on cardiac medications prior to discharge. The intervention group also received the following: (1) two sessions of medication reconciliation and education by a pharmacist within one month of discharge; (2) automated educational voice messages about medications, as well as access to pharmacists upon request throughout the study; (3) collaboration between the pharmacist and the patient’s primary care physician and/or cardiologist; and (4) regular voice messages with reminders to take and refill medications for the remainder of the year. All patients were scheduled for a 12-month clinic visit. Baseline characteristics of the 2 groups were similar (mean age = 64 years; all but 5 of the patients were men). Patients in the intervention group received an average of 4 hours of additional pharmacist time. For the primary outcome of adherence to 4 classes of cardioprotective medications (beta-blockers, statins, clopidogrel, and angiotensin-converting enzyme inhibitors or angiotensin receptor blockers), more patients in the intervention group were adherent compared with the usual care group (89% vs 74%; P = .003). The high adherence in the usual care group likely reflects self-selection bias, since enrolled patients were those who had volunteered for the study. Despite greater medication adherence in the intervention group, there were no significant differences between the 2 groups in the proportion of patients reaching blood pressure and LDL cholesterol goals. Additionally, tertiary outcomes (rehospitalization for myocardial infarction, revascularization, and mortality) were similar in the 2 groups. The estimated cost of the intervention was approximately $360 per patient, mainly because of additional pharmacist and cardiologist time. No significant differences in medication prescription costs were noted in the study.
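
Adherence in refill-based studies such as this one is commonly computed as the proportion of days covered (PDC), with PDC of 0.80 or higher counted as adherent; whether this trial used exactly that metric and cutoff is an assumption here, and the refill records below are invented examples.

```python
from datetime import date

# Proportion of days covered (PDC): days with medication on hand
# divided by days in the observation window, with PDC >= 0.80 commonly
# counted as adherent. Whether this trial used exactly this metric is
# an assumption; the refill records below are invented examples.

def pdc(fills, start, end):
    """fills: list of (fill_date, days_supply) tuples."""
    window_days = (end - start).days + 1
    covered = set()
    for fill_date, days_supply in fills:
        for offset in range(days_supply):
            day = fill_date.toordinal() + offset
            if start.toordinal() <= day <= end.toordinal():
                covered.add(day)
    return len(covered) / window_days

fills = [(date(2014, 1, 1), 30), (date(2014, 2, 5), 30),
         (date(2014, 3, 20), 30)]
value = pdc(fills, date(2014, 1, 1), date(2014, 4, 30))
print(f"PDC = {value:.2f}, adherent = {value >= 0.80}")  # 0.75, False
```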

Dr. Kulkarni is an assistant professor of hospital medicine at Northwestern University in Chicago.

Issue
The Hospitalist - 2014(04)

Article Type
Changed
Fri, 09/14/2018 - 12:14
Display Headline
Improved Mortality with CABG vs PCI in Multivessel Disease

Clinical question

For patients with multivessel disease, which is the better approach for reducing long-term mortality: coronary artery bypass grafting or percutaneous coronary intervention?

Bottom line

When compared with percutaneous coronary intervention (PCI), coronary artery bypass grafting (CABG) reduces overall mortality and myocardial infarctions (MIs) in patients with multivessel disease. Over an average follow-up of 4 years, you would need to treat 37 patients with CABG to prevent one death and 26 patients to prevent one MI, compared with a number needed to harm (NNH) of 105 to cause one additional stroke with CABG. (LOE = 1a)
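As a reminder of how these figures map to absolute risks (a standard definition, not an additional analysis of the trial data), the NNT and NNH are the reciprocals of the absolute risk difference for each outcome:

\[ \text{NNT} = \frac{1}{\text{ARR}}, \qquad \text{NNT} = 37 \;\Rightarrow\; \text{ARR} = \frac{1}{37} \approx 2.7\%, \qquad \text{NNH} = 105 \;\Rightarrow\; \text{ARI} = \frac{1}{105} \approx 1.0\% \]

So the absolute mortality benefit of CABG is roughly 3 times larger than the absolute excess stroke risk.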

Reference

Sipahi I, Akay H, Dagdelen S, Blitz A, Alhan C. Coronary artery bypass grafting vs percutaneous coronary intervention and long-term mortality and morbidity in multivessel disease: meta-analysis of randomized clinical trials of the arterial grafting and stenting era. JAMA Intern Med 2014;174(2):223-230.

Study design

Meta-analysis (randomized controlled trials)

Funding source

Unknown/not stated

Allocation

Uncertain

Setting

Various (meta-analysis)

Synopsis

Existing trials that compare CABG with PCI are underpowered to detect a difference in long-term mortality or MI. To study these outcomes, the investigators searched multiple databases, including MEDLINE and the Cochrane Central Register of Controlled Trials, to find randomized controlled trials that compared the 2 approaches, over an average follow-up of at least 1 year, in patients with multivessel disease. To ensure that the included trials reflected contemporary practice, the authors included only trials in which arterial grafts were used in at least 90% of the CABG cases and stents were used in at least 70% of the PCI cases. Two investigators independently extracted data from the 6 included trials. No formal quality assessment was performed. No publication bias was detected. Taken together, data from the 6 trials (N = 6055) showed that the use of CABG, as compared with PCI, resulted in a 27% relative reduction in mortality (relative risk [RR] = 0.73; 95% CI, 0.62-0.86) and a 42% relative reduction in MI (RR = 0.58; 95% CI, 0.48-0.72). Although there was a nonsignificant trend toward more strokes in the CABG group (likely related to periprocedural events), CABG also led to fewer repeat revascularizations (number needed to treat [NNT] = 7) and fewer overall major adverse cardiac and cerebrovascular events (NNT = 10).
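As a rough consistency check (simple arithmetic from the reported figures, not an analysis in the paper), the pooled relative risk and the mortality NNT together imply the approximate baseline risk: the absolute risk reduction equals the control-group risk p_0 times (1 - RR), so

\[ \text{NNT} = \frac{1}{p_0\,(1 - \text{RR})} \quad\Rightarrow\quad p_0 = \frac{1}{\text{NNT}\,(1 - \text{RR})} = \frac{1}{37 \times 0.27} \approx 10\% \]

That is, the reported NNT of 37 is consistent with roughly 10% mortality in the PCI arm over the average 4-year follow-up.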

Dr. Kulkarni is an assistant professor of hospital medicine at Northwestern University in Chicago.

Issue
The Hospitalist - 2014(04)