Explaining obesity in cancer survivors
Researchers have identified several factors that may influence the risk of obesity in childhood cancer survivors.
Previous research showed that obesity rates are elevated in childhood cancer survivors who were exposed to cranial radiation.
But the new study has shown that other types of treatment, a patient’s age, and certain genetic variants are associated with obesity in this population.
Carmen Wilson, PhD, of St. Jude Children’s Research Hospital in Memphis, Tennessee, and her colleagues reported these findings in Cancer.
The researchers evaluated 1,996 childhood cancer survivors treated at St. Jude. The patients’ median age at diagnosis was 7.2 years (range, 0.1-24.8), and their median age at follow-up was 32.4 years (range, 18.9-63.8).
At the time of evaluation, 645 patients (32.3%) were of normal weight, 71 (3.6%) were underweight, 556 (27.9%) were overweight, and 723 (36.2%) were obese.
The prevalence of obesity was highest among male survivors of leukemia (42.5%) and, among females, in survivors of neuroblastoma (43.6%), followed closely by survivors of leukemia (43.1%).
Multivariable analyses showed that 3 factors were independently associated with an increased risk of obesity: older age at the time of evaluation (≥30 years vs <30 years; P<0.001), undergoing cranial radiation (P<0.001), and receiving glucocorticoids (P=0.004).
On the other hand, receiving chest, abdominal, or pelvic radiation was associated with a decreased risk of obesity (P<0.001).
The researchers also identified 166 single nucleotide polymorphisms that were associated with obesity among cancer survivors who had received cranial radiation. The strongest association was in variants of genes involved in neuron growth, repair, and connectivity.
Among survivors who did not receive cranial radiation, only 1 single nucleotide polymorphism—rs12073359, located on chromosome 1—was associated with an increased risk of obesity.
Dr Wilson said these findings might help us identify the childhood cancer survivors who are most likely to become obese. The results may also provide a foundation for future research efforts aimed at characterizing molecular pathways involved in the link between childhood cancer treatment and obesity.
Antimalarial drug unavailable, CDC says
Photo courtesy of the FDA
The antimalarial drug chloroquine is not currently available from US suppliers, according to the Centers for Disease Control and Prevention (CDC).
The agency said it will provide updates as more information becomes available from the Food and Drug Administration.
Chloroquine is used as malaria treatment and prophylaxis, but hydroxychloroquine sulfate can be prescribed in place of chloroquine when indicated.
Healthcare providers who need assistance diagnosing or managing suspected or confirmed cases of malaria can call the CDC Malaria Hotline at 1-855-856-4713 (Monday through Friday, 9 am to 5 pm, Eastern time).
For emergency consultation after hours, providers can call 1-770-488-7100 and ask to speak with a CDC Malaria Branch clinician.
ABA: Childhood burn survivors risk more physical, mental disorders
CHICAGO – Adult survivors of childhood burns have significantly higher rates of Axis I mental and physical disorders years after their injury, a population-based study shows.
“We think it is really important to screen for, identify, and treat these illnesses not only in that acute period and shortly after the burn injury, but well into adulthood,” study author James Stone said at the annual meeting of the American Burn Association.
He reported on 745 adult burn survivors identified using administrative data from a regional pediatric burn center registry in Manitoba, Canada, who were matched 1:5 with 3,725 controls from the general Manitoba population based on age, sex, and geographic location. The burn survivors had an average age of 5.9 years at the time of burn injury, burns involved an average 12% of total body surface area, and 65% of burn survivors were male. The average follow-up was nearly 15 years (range 2.8-24.7 years).
In unadjusted univariate analysis, adult survivors had significantly higher rates than matched controls for any lifetime physical disorder (rate ratio, 1.17), arthritis (RR, 1.23), cancer (RR, 1.94), diabetes (RR, 1.69), fractures (RR, 1.45), and total respiratory morbidity (RR, 1.15).
After adjustment for gender, geography, and income, any physical disorder (RR, 1.15; P < .01), arthritis (RR, 1.24; P < .01), fractures (RR, 1.37; P < .001), and total respiratory morbidity (RR, 1.13; P < .05) remained significant, Mr. Stone, from the University of Manitoba, Winnipeg, Canada, reported.
Further, 81% of burn survivors had a lifetime physical illness compared with 69% of controls.
“The fact that 81% of our burn cohort was diagnosed with a physical illness is definitely concerning,” he said. “We hypothesize that the prolonged hyperinflammatory and hypermetabolic state that has been previously reported makes these individuals more susceptible to these illnesses down the road.”
The burn cohort also had significantly higher unadjusted rate ratios for any Axis I mental disorder (RR, 1.62), major depressive disorder (RR, 1.64), anxiety (RR, 1.57), substance abuse (RR, 2.86), and suicide attempts (RR, 5.00).
All disorders remained statistically significant after adjustment with rate ratios of 1.54 (P < .001), 1.54 (P < .001), 1.50 (P < .001), 2.35 (P < .001), and 4.33 (P < .01), respectively.
The high rates of substance abuse and suicide attempts are consistent with previous clinical interview studies, but are still cause for great concern, Mr. Stone said.
The risk for any mental or physical disorder was not significantly impacted by burn location or by burns that affected more than 30% of total body surface area. Age older than 5 years at the time of the burn significantly increased the risk of any mental disorder (relative risk, 1.92; P < .001).
Limitations of the study include the potential for bias because the data relied on individuals presenting to physicians, discrepancies between ICD codes for physician billings and hospital claims, and some survivors may have moved out of the province, Mr. Stone said. The study, however, had a sample size three times greater than the next largest study of its kind, and importantly, matched burn patients to the general population.
AT THE ABA ANNUAL MEETING
Key clinical point: Adult survivors of childhood burn injuries have increased rates of Axis I mental and physical disorders.
Major finding: 81% of burn survivors had a physical disorder vs. 69% of matched controls.
Data source: Population-based study in 745 adult survivors of childhood burns.
Disclosures: The study was funded by grants from the University of Manitoba and the Manitoba Firefighters Burn Fund. The authors declared no conflicts of interest.
HCV spike in four Appalachian states tied to drug abuse
Acute hepatitis C virus infections more than tripled among young people in Kentucky, Tennessee, Virginia, and West Virginia between 2006 and 2012, investigators reported online May 8 in Morbidity and Mortality Weekly Report.
“The increase in acute HCV infections in central Appalachia is highly correlated with the region’s epidemic of prescription opioid abuse and facilitated by an upsurge in the number of persons who inject drugs,” said Dr. Jon Zibbell at the Centers for Disease Control and Prevention and his associates.
Nationally, acute HCV infections have risen most steeply in states east of the Mississippi. To further explore the trend, the researchers examined HCV case data from the National Notifiable Disease Surveillance System, and data on 217,789 admissions to substance abuse treatment centers related to opioid or injection drug abuse (MMWR 2015;64:453-8).
Confirmed HCV cases among individuals aged 30 years and younger rose by 364% in the four Appalachian states during 2006-2012, the investigators found. “The increasing incidence among nonurban residents was at least double that of urban residents each year,” they said. Among patients with known risk factors for HCV infection, 73% reported injection drug use.
During the same time, treatment admissions for opioid dependency among individuals aged 12-29 years rose by 21% in the four states, and self-reported injection drug use rose by more than 12%, the researchers said. “Evidence-based strategies as well as integrated-service provision are urgently needed in drug treatment programs to ensure patients are tested for HCV, and persons found to be HCV infected are linked to care and receive appropriate treatment,” they concluded. “These efforts will require further collaboration among federal partners and state and local health departments to better address the syndemic of opioid abuse and HCV infection.”
The investigators declared no funding sources or financial conflicts of interest.
Key clinical point: Hepatitis C virus infections more than tripled among young people in Kentucky, Tennessee, Virginia, and West Virginia, and were strongly tied to rises in opioid and injection drug abuse.
Major finding: From 2006 to 2012, the number of acute HCV infections increased by 364% among individuals aged 30 years or younger.
Data source: Analysis of HCV case data from the National Notifiable Disease Surveillance System and of substance abuse admissions data from the Treatment Episode Data Set.
Disclosures: The investigators reported no funding sources or financial conflicts of interest.
Be true to yourself
How often have nonphysicians told you that they could never work the hours you do?
Most people think physicians are a unique breed, and in some respects, we are. But in important ways we are just like everyone else. When we work long hours under stressful conditions and go without adequate sleep or nourishment, we cannot function at peak performance. Just like everyone else, we can become irritable, grumpy, and cynical when our basic needs are not met. We are human too, and we are at higher risk than most people for burnout, depression, and even suicide.
An article in the Journal of Hospital Medicine in 2014 noted that slightly over 50% of hospitalists were affected by burnout. We scored high on the emotional exhaustion subscale, and 40.3% of us had symptoms of depression, with a surprising 9.2% rate of recent suicidality. Hospital medicine definitely has its advantages over many other fields of medicine, but as this study demonstrates, there is still much to be desired in our “work-life balance.”
Each practice has its own perks and negatives, and what will enhance the lives of hospitalists in one group may make intolerable the lives of members of another group. For instance, it is no surprise that 12-hour shifts with 7-on, 7-off block scheduling can be exhausting. If you have a family, this schedule leaves plenty of fun time on the weeks you are off, but you may still be missing 50% of your family’s life if you leave for work before your kids wake up and return after they go to bed.
Whatever your concerns and stressors may be, rest assured, you are not alone, and if enough of the members of your group have similar issues, you may be successful addressing them with your director or hospital administrator. Retaining good hospitalists is vital to the financial success of many hospitals, and being flexible enough to truly meet their reasonable needs can literally make or break a hospitalist team.
Impotence drug could prevent malaria transmission
Blood cell that has stiffened after treatment; © 2015 Ramdani et al.
The erectile dysfunction drug sildenafil (Viagra) could prevent transmission of the malaria parasite, according to research published in PLOS Pathogens.
Investigators found that sildenafil increases the stiffness of erythrocytes infected by the parasite Plasmodium falciparum.
This allows the cells to be eliminated from the bloodstream and may therefore reduce transmission of the malaria parasite from humans to mosquitoes.
The investigators noted that P falciparum has a complex developmental cycle that is completed partly in humans and partly in mosquitoes. Treatments for malaria target the asexual forms of this parasite that cause symptoms, but not the sexual forms transmitted from a human to a mosquito.
Malaria eradication therefore necessitates new types of treatments against sexual forms of the parasite in order to block transmission and prevent dissemination of the disease.
The sexual forms of P falciparum develop in human erythrocytes sequestered in the bone marrow before they are released into the blood. They are then accessible to mosquitoes, which can ingest them when they bite.
Circulating erythrocytes are deformable, which prevents their clearance by the spleen. Gametocyte-infected erythrocytes are similarly deformable, so they can easily pass through the spleen and persist for several days in the blood circulation.
With this in mind, Ghania Ramdani, of Université Paris Descartes in France, and colleagues sought to stiffen the infected erythrocytes so they would be removed from circulation.
The team found that the deformability of gametocyte-infected erythrocytes is regulated by a signaling pathway that involves cAMP. When cAMP molecules accumulate, an erythrocyte becomes stiffer. And cAMP is degraded by the enzyme phosphodiesterase, which promotes erythrocyte deformability.
Using an in vitro model reproducing filtration by the spleen, the investigators were able to identify several pharmacological agents that inhibit phosphodiesterases and can therefore increase the stiffness of infected erythrocytes.
One of these agents is sildenafil. The team showed that a standard dose of the drug had the potential to increase the stiffness of erythrocytes infected with sexual forms of the parasite and therefore favor the elimination of these cells from the circulation.
This discovery could lead to new ways to stop the spread of malaria, the investigators said. They believe that modifying the active substance in sildenafil to block its erectile effect, or testing similar agents devoid of this effect, could indeed result in a treatment to prevent transmission of the parasite from humans to mosquitoes.
Eat slowly to reduce calories consumed
I freely admit I am obsessed with research articles about eating habits. I hold out hope that this will eventually unlock the magic bullet to cure us of the modern plague of obesity. At a certain level, our patients need us to be captivated by such literature. We should feel fairly comfortable with the common knowledge that diets are effective if you stay on them and reducing the caloric density of foods can result in meaningful weight loss.
But what about how quickly we eat? In our fast-paced, heavily caffeinated society, we seem to shovel rather than chew. Ever since I was a medical resident, I have practically inhaled my food. Perchance I am operating under the erroneous and illogical assumption that if I don’t taste the food it won’t register as calories. True science has now enlightened me to the error in my thinking.
Dr. Eric Robinson and his colleagues conducted a brilliant systematic review of the impact of eating rate on energy intake and hunger (Am. J. Clin. Nutr. 2014;100:123-51). They included studies with at least one arm in which participants ate a meal at a significantly slower rate than participants in another arm. Twenty-two studies met the criteria for inclusion.
Available evidence suggests that a slower eating rate is associated with lower intake, compared with faster eating. The effect on caloric intake was observed regardless of the intervention used to modify the eating rate, such as modifying food from soft (fast rate) to hard (slow rate) or verbal instruction. No relationship was observed between eating rate and hunger at the end of the meal or several hours later.
Intriguing to me is the hypothesis that eating rate likely affects intake through the duration and intensity of oral exposure to taste. Previous studies have shown that, when eating rate is held constant, increasing sensory exposure leads to lower energy intake. This seems to relate to our innate wiring that gives us “sensory specific satiety.” In my understanding, sensory specific satiety turns off the appetitive drive when you have had too much chocolate or too many potato chips and feel slightly ill. Unfortunately, the food industry is on to this game and has designed foods that are perfectly balanced to never trigger satiety. These foods can tragically be eaten ceaselessly.
Take-home message: If your patients cannot control the bad foods they eat, they should try to eat them more slowly.
Dr. Ebbert is professor of medicine, a general internist at the Mayo Clinic in Rochester, Minn., and a diplomate of the American Board of Addiction Medicine. The opinions expressed are those of the author and do not necessarily represent the views and opinions of the Mayo Clinic. The opinions expressed in this article should not be used to diagnose or treat any medical condition nor should they be used as a substitute for medical advice from a qualified, board-certified practicing clinician.
Inhibitor may benefit certain ALL patients
PHILADELPHIA—Results of preclinical research suggest the BCL-2 inhibitor ABT-199 (venetoclax) may be effective in certain pediatric patients with acute lymphoblastic leukemia (ALL).
In xenograft models of various ALL subtypes, ABT-199 produced an objective response rate below 30%.
However, additional analyses unearthed information that could help identify which ALL patients might respond to the drug.
Santi Suryani, PhD, of the Children’s Cancer Institute in Sydney, New South Wales, Australia, and her colleagues presented this research at the AACR Annual Meeting 2015 (abstract 3276*). The work was supported by AbbVie, one of the companies developing ABT-199.
Dr Suryani and her colleagues decided to investigate ABT-199 in pediatric ALL after observing mixed results with the BCL-2/BCL-W/BCL-XL inhibitor ABT-263 (navitoclax).
ABT-263 delayed ALL progression in nearly all of the xenograft models the team tested and produced a 61% response rate. However, the drug also induced BCL-XL-mediated thrombocytopenia.
As ABT-199 doesn’t target BCL-XL, the researchers thought the drug might produce responses similar to those seen with ABT-263 without inducing thrombocytopenia.
“When ABT-199 came into the picture, we were very excited,” Dr Suryani said. “We thought, ‘This is a wonder drug. This will cure pediatric ALL.’”
To test this hypothesis, the team compared ABT-199 (100 mg/kg x 21 days) and vehicle control in 19 pediatric ALL patient-derived xenografts, including infant mixed-lineage leukemia (MLL) ALL (n=4), B-cell precursor (BCP) ALL (n=5), BCP-ALL categorized as Ph-like (n=4), T-cell ALL (n=4), and early T-cell precursor (ETP) ALL (n=2).
ABT-199 significantly delayed progression in 12 xenografts (63%) for periods ranging from 0.4 days to 28 days. And the drug produced objective responses in 5 xenografts (26%).
Responses occurred in MLL-ALL, BCP-ALL, and Ph-like BCP ALL, but not T-cell ALL or ETP-ALL. Complete responses were seen in MLL-ALL (n=1) and BCP-ALL (n=2), and partial responses occurred in MLL-ALL (n=1) and Ph-like BCP-ALL (n=1).
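The cohort composition and response rates reported above can be sanity-checked with a short script (the subtype counts are taken from the study as reported; the script itself is purely illustrative):

```python
# Sanity-check the reported xenograft counts and response rates.
# Counts are as reported in the study; the script is illustrative only.
subtypes = {
    "MLL-ALL": 4,
    "BCP-ALL": 5,
    "Ph-like BCP-ALL": 4,
    "T-cell ALL": 4,
    "ETP-ALL": 2,
}

total = sum(subtypes.values())      # 19 patient-derived xenografts
delayed = 12                        # xenografts with delayed progression
responders = 5                      # xenografts with objective responses

delay_rate = round(100 * delayed / total)        # 63%
response_rate = round(100 * responders / total)  # 26%

print(total, delay_rate, response_rate)  # 19 63 26
```

Both percentages match the figures quoted in the text (63% and 26%), and the subtype counts sum to the 19 xenografts studied.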
As the response rate with ABT-263 was more than double that of ABT-199 (61% vs 26%), the researchers found the results with ABT-199 “a little bit disappointing,” according to Dr Suryani.
“But we thought, ‘That’s okay. That already tells us the science behind it—that pediatric ALL is probably more BCL-XL-dependent, rather than BCL-2-dependent,’” she said. “We wondered if there was any way we could come up with a predictive biomarker so we could select patients who will benefit from this treatment.”
With that in mind, the researchers evaluated the link between protein expression and response. They looked at BCL-2 and BCL-XL, as well as a range of other proteins, including BCL-W, MCL1, BAK1, and BAX, among others.
And they found that high BCL-XL and low BCL-2 expression were significantly associated with ABT-199 resistance.
The researchers are still investigating ways to guide treatment with ABT-199 in ALL. They are also hoping to improve responses by administering the drug in combination with other agents.
*Information in the abstract differs from that presented at the meeting.
AAN: Scheduled daily DBS effective in small Tourette syndrome study
WASHINGTON – Scheduled administration of bilateral deep brain stimulation of the centromedian thalamus for less than 2 hours a day resulted in a significant reduction in tics in several patients with Tourette syndrome over 2 years in a proof-of-concept study presented at the annual meeting of the American Academy of Neurology.
Of the four patients who completed the 24-month study, three experienced significant improvements, said Justin Rossi, an MD-PhD candidate at the University of Florida in Gainesville.
Instead of using standard continuous deep brain stimulation (DBS), Mr. Rossi and colleagues at the university's Center for Movement Disorders and Neurorestoration evaluated a scheduled, personalized approach: bilateral stimulation of the centromedian thalamus tailored to the times of day when patients experienced the most sequelae from their tics, such as when driving, exercising, or working, and when tic intensity was greatest.
The rationale for investigating this approach is that instead of using the “classical continuous approach” to DBS, a tailored approach might be effective in these patients, with the potential benefits of increasing battery life (and delaying another surgical procedure to replace the battery) and reducing side effects associated with stimulation, Mr. Rossi said.
Many studies have found that DBS is effective in “select medication-refractory cases of Tourette syndrome,” he noted. “However, in contrast to Parkinson’s disease, essential tremor, and other movement disorders for which DBS has been commonly used as a therapy, Tourette syndrome is a paroxysmal disorder,” and the frequency of tics can vary from patient to patient, with individual patients reporting that the intensity of tics “waxes and wanes throughout the day, often predictably.”
The study enrolled five patients; responses were evaluated with two rating scales, the Yale Global Tic Severity Scale (YGTSS) and the Modified Rush Video-Based Tic Rating Scale (MRTRS). The primary outcome was response at 24 months, with response defined as more than a 40% improvement from the preoperative baseline on the YGTSS or MRTRS. (One patient was lost to follow-up after 18 months because the center was too far away.) Patients had the opportunity to modify the stimulation schedule at each 6-month visit.
At 24 months, the YGTSS total scores improved by 46%, 58%, and 17% and the MRTRS total scores improved by 79%, 81%, and 44% in the three responders. These patients had a mean stimulation time of 1.85 hours a day, ranging from 47 to 186 minutes per day. The one patient who did not meet the primary endpoint – with a 10% response on the YGTSS and a 21% response in the MRTRS – had the greatest amount of stimulation per day (4 hours a day). At 24 months, the responders had statistically significant improvements from baseline in components of the two scales, including the number of phonic tics per minute, motor tic severity, and phonic tic severity, Mr. Rossi said.
This is a proof-of-concept study and the results and conclusions are preliminary, but the results “warrant larger studies,” he concluded.
The results shed some light on whether the mechanism of DBS in Tourette syndrome is a cumulative effect of stimulation over time or an effect around the time the tics occur, and they support the latter explanation, Mr. Rossi speculated. More research is needed to understand this mechanism at a physiological level, work that is being pursued at his center, he added.
He had no disclosures. The study was sponsored by the National Institutes of Health.
The rationale for investigating this approach is that instead of using the “classical continuous approach” to DBS, a tailored approach might be effective in these patients, with the potential benefits of increasing battery life (and delaying another surgical procedure to replace the battery) and reducing side effects associated with stimulation, Mr. Rossi said.
Many studies have found that DBS is effective in “select medication-refractory cases of Tourette syndrome,” he noted. “However, in contrast to Parkinson’s disease, essential tremor, and other movement disorders for which DBS has been commonly used as a therapy, Tourette syndrome is a paroxysmal disorder,” and the frequency of tics can vary from patient to patient, with individual patients reporting that the intensity of tics “waxes and wanes throughout the day, often predictably.”
The study enrolled five patients; responses were evaluated with two rating scales, the Yale Global Tic Severity Scale (YGTSS) and the Modified Rush Video-Based Tic Rating Scale (MRTRS). A patient was considered a responder if the YGTSS or MRTRS score improved by more than 40% from the preoperative baseline at 24 months, the primary outcome. (One patient was lost to follow-up after 18 months because of the distance to the center.) Patients had the opportunity to modify the stimulation schedule at each 6-month visit.
At 24 months, the YGTSS total scores improved by 46%, 58%, and 17% and the MRTRS total scores improved by 79%, 81%, and 44% in the three responders. These patients had a mean stimulation time of 1.85 hours a day, ranging from 47 to 186 minutes per day. The one patient who did not meet the primary endpoint – with a 10% response on the YGTSS and a 21% response in the MRTRS – had the greatest amount of stimulation per day (4 hours a day). At 24 months, the responders had statistically significant improvements from baseline in components of the two scales, including the number of phonic tics per minute, motor tic severity, and phonic tic severity, Mr. Rossi said.
This is a proof-of-concept study and the results and conclusions are preliminary, but the results “warrant larger studies,” he concluded.
The results shed some light on whether the mechanism of DBS in Tourette syndrome is a cumulative effect of stimulation over time or an effect around the time the tics occur; the findings support the latter explanation, Mr. Rossi speculated. More research is needed to understand the mechanism at a physiological level, work that is being pursued at his center, he added.
He had no disclosures. The study was sponsored by the National Institutes of Health.
AT THE AAN 2015 ANNUAL MEETING
Key clinical point: Promising results of a tailored approach to deep brain stimulation in three patients with Tourette syndrome merit a larger trial.
Major finding: In three of the four patients who completed the study, DBS of the centromedian thalamus for less than 2 hours a day resulted in significant improvements over 24 months.
Data source: A proof-of-concept study in five patients with Tourette syndrome, evaluating DBS of the centromedian thalamus, scheduled for times when tics interfered with activities or were most intense.
Disclosures: The National Institutes of Health sponsored the study. Mr. Rossi had no disclosures.
Obesity increases risk of bleeding on warfarin
Obese patients on warfarin may be at greater risk of bleeding than those of normal weight, according to a study presented at the American Heart Association’s Arteriosclerosis, Thrombosis, and Vascular Biology/Peripheral Vascular Disease Scientific Sessions 2015.
Researchers followed 863 patients attending an anticoagulation clinic for 1 year and found that obesity (body mass index greater than 30 kg/m²) was associated with a statistically significant 84% increase in the risk of major bleeds, such as gastrointestinal, intracerebral, and retroperitoneal hemorrhage.
The study also showed that bleeding risk rose with the severity of obesity: risk was increased by 30% in patients with class I obesity but by 93% in patients with class III obesity.
“This result suggests that BMI plays a role in bleeding events in patients on warfarin [and] future studies are needed to understand the mechanism by which obesity increases bleeding risk for patients on warfarin, and whether similar risk exists for the novel oral anticoagulants,” said Dr. Adedotun A. Ogunsua of the University of Massachusetts, Worcester, and coauthors.
There were no conflicts of interest disclosed.
FROM ATVB/PVD 2015
Key clinical point: Obesity is associated with an increased risk of major bleeding in patients taking warfarin.
Major finding: Obese patients on warfarin had an 84% increased incidence of major bleeding.
Data source: Observational study of 863 patients attending an anticoagulation clinic.
Disclosures: No conflicts of interest were disclosed.