Varied diet not necessarily a high-quality one
A diverse diet is not necessarily a healthy one, according to a science advisory from the American Heart Association, which instead emphasizes the importance of a healthy eating pattern.
Published in the Aug. 9 online edition of Circulation, the science advisory was prompted by emerging evidence that greater dietary diversity may be associated with eating more poor-quality foods and with higher energy intake, especially among middle-aged adults.
Researchers conducted a literature search covering 2000-2017 for studies of dietary diversity – defined as the number of different foods or food groups eaten over a given period of time – and dietary quality.
However, they also noted that many of the studies had significant limitations, which contributed to considerable inconsistency across the evidence base.
For example, one study in overweight and obese individuals found that increasing dietary diversity was associated with a decrease in body mass index, but only with respect to intake of low-energy-dense foods. Another study, in Chinese adults, found that greater diversity in snack intake – but not in intake of grains, vegetables, fruits, meats, or beverages – was associated with 45% greater odds of being overweight, compared with lower diversity of snack consumption.
Similarly, an observational study in 2,505 U.S. adults found individuals in the highest quintile of dietary diversity had a 120% greater gain in waist circumference, compared with those in the lowest quintile.
“Associations with dissimilarity scores are consistent with evidence from feeding studies showing that exposure to foods with different characteristics led to increased energy intake, which may partially explain gain in waist circumference over time,” wrote Marcia C. de Oliveira Otto, PhD, from the University of Texas Health Science Center at Houston, and coauthors.
A similar pattern was seen in short-term interventional studies, most of which showed that access to a wider variety of foods led to increased intake, compared with being served a single food.
For example, one study showed that adults offered a second course of sandwiches with fillings different from those in the first course ate 30% more than those served the same option for both courses.
Another study randomized overweight and obese adults to receive either an unlimited number of snack options, each consumed less than once a day, or any amount of a single favored snack, with all snacks fitting within a daily caloric goal. Over the course of 8 weeks, participants offered a variety of snacks ate 25% more servings than those limited to one snack type.
The authors suggested that variety amplifies sensory stimulation and decreases satiety.
“Although calorie restriction goals were achieved in both groups, a significant increase in sensory-specific satiety and monotony ratings over time was observed in participants assigned to the one-snack option but not in participants assigned to a variety of snacks,” they wrote.
The relationship between dietary diversity and dietary quality is also complex. Investigators for a cross-sectional study in China found less-than-optimal consumption of the nine food groups in the Chinese dietary guidelines – in particular, fruits, vegetables, fish, and dairy – in diets with higher diversity scores.
“Overall, limited evidence shows no benefit to diet quality or diet healthfulness associated with increased food count or with a more even distribution of energy across foods, whereas findings from one observational study suggest that greater dissimilarity in foods consumed may be inversely associated with a healthy eating pattern,” the authors wrote.
In conclusion, the advisory committee said that it was more appropriate to promote a healthy eating pattern, emphasizing intake of plant foods, protein sources, low-fat dairy, vegetable oils, and nuts.
One author declared research funding from the Hass Avocado Board. No other relevant conflicts of interest were declared.
SOURCE: de Oliveira Otto MC et al. Circulation. 2018 Aug. 9. doi: 10.1161/CIR.0000000000000595.
FROM CIRCULATION
Key clinical point: Dietary variety may not reflect healthiness or dietary quality.
Major finding: Greater dietary diversity was associated with higher energy intake and poorer diet quality.
Study details: Literature review and science advisory from the American Heart Association.
Disclosures: One author declared research funding from the Hass Avocado Board. No other relevant conflicts of interest were declared.
Source: de Oliveira Otto MC et al. Circulation. 2018 Aug. 9. doi: 10.1161/CIR.0000000000000595.
Group releases new CLL guidelines
Fludarabine, cyclophosphamide, and rituximab are recommended as initial therapy for fit patients with chronic lymphocytic leukemia (CLL) who do not have TP53 disruption, according to new guidelines from the British Society for Haematology.
The guidelines update the 2012 recommendations on CLL to include “significant” developments in treatment.
The new guidelines were published in the British Journal of Haematology.
Anna H. Schuh, MD, of the University of Oxford in the UK, and her coauthors noted that, while the guidelines apply to treatments available outside clinical trials, patients with CLL should be treated within a clinical trial wherever possible.
While recommending fludarabine, cyclophosphamide, and rituximab as first-line therapy, the guideline authors acknowledged that the combination of bendamustine and rituximab is an acceptable alternative for patients who cannot tolerate the triple therapy because of factors such as advanced age, renal impairment, or limited marrow capacity.
Similarly, less-fit patients can also be considered for chlorambucil-obinutuzumab or chlorambucil-ofatumumab combinations.
All patients diagnosed with CLL should be tested for TP53 deletions and mutations before each line of therapy, the guideline committee recommended.
TP53 disruption – either a deletion of chromosome 17p or a mutation in the TP53 gene – renders chemoimmunotherapy ineffective. However, there is compelling evidence for the efficacy of ibrutinib in these patients, or of idelalisib plus rituximab in those with cardiac disease or receiving vitamin K antagonists.
With respect to maintenance therapy, the guidelines noted that this was not routinely recommended in CLL as “it is unclear to what extent the progression-free survival benefit is offset by long-term toxicity.”
Patients who are refractory to chemoimmunotherapy, who have relapsed, or who cannot be retreated with chemoimmunotherapy should be treated with idelalisib plus rituximab or ibrutinib monotherapy, the guidelines suggested.
“Deciding whether ibrutinib or idelalisib with rituximab is most appropriate for an individual patient depends on a range of factors, including toxicity profile and convenience of delivery,” the authors wrote.
However, they noted that the value of adding bendamustine to either option was unclear, as research had not shown significant associated gains in median progression-free survival.
Allogeneic stem cell transplant should be considered as an option for patients who have failed chemotherapy, have a TP53 disruption and have not responded to B-cell receptor signaling pathway inhibitors such as ibrutinib, or have Richter’s transformation.
The guidelines also addressed autoimmune cytopenias, which occur in 5%-10% of patients with CLL and precede the diagnosis of CLL in about 9% of cases.
Patients in whom autoimmune cytopenia is the dominant clinical feature should be treated with corticosteroids, intravenous immunoglobulin, or rituximab. However, when the cytopenia is triggered by CLL therapy, the guidelines recommend halting treatment and beginning immunosuppression.
The guideline development was supported by the British Society for Haematology. The UK CLL Forum, which was involved in development as well, is a registered charity that receives funding from a number of pharmaceutical companies.
Concurrent stimulant and opioid use ‘common’ in adult ADHD
A significant number of adults with attention-deficit/hyperactivity disorder (ADHD) are concurrently using stimulants and opioids, highlighting a need for research into the risks and benefits of long-term coadministration of these medications.
Researchers reported the results of a cross-sectional study using Medicaid Analytic eXtract data from 66,406 adults with ADHD across 29 states.
Overall, 32.7% used stimulants, and 5.4% had used both stimulants and opioids long term, defined as at least 30 consecutive days of use. Long-term opioid use was more common among adults who used stimulants, compared with those who did not use stimulants (16.5% vs. 13%), wrote Yu-Jung “Jenny” Wei, PhD, and her associates. The report was published in JAMA Network Open.
Most of the adults who used both stimulants and opioids concurrently long term were using short-acting opioids (81.8%) rather than long-acting (20.6%). However, nearly one-quarter (23.2%) had prescriptions for both long- and short-acting opioids.
The researchers noted a significant 12% increase in the prevalence of concurrent use of stimulants and opioids from 1999 to 2010.
“Our findings suggest that long-term concurrent use of stimulants and opioids has become an increasingly common practice among adult patients with ADHD,” wrote Dr. Wei, of the College of Pharmacy at the University of Florida, Gainesville, and her associates.
The researchers also found an increase in these trends with age: Adults in their 30s showed a 7% higher prevalence of long-term concurrent use, compared with adults in their 20s. In addition, those aged 41-50 years had a 14% higher prevalence, and those aged 51-64 years had a 17% higher prevalence.
Adults with pain had a 10% higher prevalence of concurrent use, while people with schizophrenia appeared to have a 5% lower prevalence of concurrent use.
“Although the concurrent use of stimulants and opioids may initially have been prompted by ADHD symptoms and comorbid chronic pain, continued use of opioids alone or combined with central nervous system stimulants may result in drug dependence and other adverse effects (e.g., overdose) because of the high potential for abuse and misuse,” the authors wrote. “Identifying these high-risk patients allows for early intervention and may reduce the number of adverse events associated with the long-term use of these medications.”
Among the limitations cited was that only prescription medications filled and reimbursed by Medicaid were included in the analysis. “Considering that opioid prescription fills are commonly paid out of pocket, our reported prevalence of concurrent stimulant-opioid use may be too low,” they wrote.
The authors reported no conflicts of interest. One author was supported by an award from the National Institute on Aging.
SOURCE: Wei Y-J et al. JAMA Network Open. 2018 Aug 10. doi: 10.1001/jamanetworkopen.2018.1152.
FROM JAMA NETWORK OPEN
Key clinical point: Identifying high-risk patients “allows for early intervention and may reduce the number of adverse events associated with the long-term use of these medications.”
Major finding: About 5% of adults with ADHD are on both opioids and stimulants long term.
Study details: Cross-sectional study of 66,406 adults with ADHD.
Disclosures: The authors reported no conflicts of interest. One author was supported by an award from the National Institute on Aging.
Source: Wei Y-J et al. JAMA Network Open. 2018 Aug 10. doi: 10.1001/jamanetworkopen.2018.1152.
EGFR-mutant NSCLC may still respond to PD-1 blockade
PD-1 blockade should still be considered as a second-line approach in patients with EGFR-mutant non–small-cell lung cancer (NSCLC), as some patients may respond to this therapy, investigators reported.
The researchers described a patient who was diagnosed with epidermal growth factor receptor (EGFR)-mutant NSCLC but who also showed high expression of PD-L1.
The 62-year-old woman first presented with a lung nodule and metastasis in the right hip, which were EGFR-mutant and had a PD-L1 tumor proportion score of 90%. She was treated with erlotinib – an EGFR tyrosine kinase inhibitor – and radiotherapy, according to the report, published in Annals of Oncology.
Two months later, she developed multiple new metastases in the chest wall. These also showed high PD-L1 expression, but no EGFR mutations, so she was treated with pembrolizumab as second-line therapy.
After the first course of treatment with pembrolizumab, the hip pain improved dramatically, and after three cycles the primary and metastatic lesions disappeared completely.
“Immunofluorescent analysis of both right ilium and chest wall lesions with an anti-EGFR antibody specific to Ex.19 del and an anti-PD-L1 antibody revealed the presence of distinct heterogeneity of EGFR-mutant clones and PD-L1 highly-expressing clones,” wrote Kei Kunimasa, MD, of Osaka International Cancer Institute, and coauthors.
Current clinical trial evidence suggests that patients with EGFR mutations are likely to have a poor response to PD-1 blockade in NSCLC, and the authors agreed that these patients should receive EGFR tyrosine kinase inhibitors as first-line therapy.
“However, the present case suggested that shorter PFS in EGFR-TKI treatment for EGFR-mutant NSCLC patients with high PD-L1 TPS and without acquired T790M mutation could encourage us to try PD-1 blockade therapy as second line therapy,” they wrote.
The investigators suggested that certain clinical factors could identify patients with EGFR-mutant NSCLC who might respond well to PD-1 blockade, such as progression-free survival on EGFR tyrosine kinase inhibitor treatment, other targetable resistant mutations, tumor mutation burden, and PD-L1 expression.
Two authors reported grants and personal fees from pharmaceutical companies outside the submitted work. No other conflicts of interest were declared.
SOURCE: Kunimasa K et al. Ann Oncol. 2018 Aug 7. doi: 10.1093/annonc/mdy312.
FROM ANNALS OF ONCOLOGY
Key clinical point: Consider PD-1 blockade in EGFR-mutant non–small-cell lung cancer with high PD-L1 expression.
Major finding: EGFR-mutant non–small-cell lung cancer may still respond to PD-1 blockade.
Study details: Case report of a 62-year-old woman with EGFR-mutant NSCLC and high PD-L1 expression.
Disclosures: Two authors reported grants and personal fees from pharmaceutical companies outside the submitted work. No other conflicts of interest were declared.
Source: Kunimasa K et al. Ann Oncol. 2018 Aug 7. doi: 10.1093/annonc/mdy312.
Increased B-cell lymphoma risk with JAK1/2 inhibitors
Patients with myeloproliferative neoplasms treated with Janus kinase (JAK) 1/2 inhibitors may be at significantly increased risk of aggressive B-cell non-Hodgkin lymphoma, according to a study published in Blood.
A retrospective cohort study of 626 Viennese patients with myeloproliferative neoplasms – 69 of whom were treated with JAK1/2 inhibitors – found that 4 of the 69 (5.8%) developed aggressive B-cell lymphoma, compared with just 2 patients (0.36%) in the rest of the group. This represented a significant, 16-fold higher risk of aggressive B-cell lymphoma associated with JAK1/2 inhibitor therapy (P = .0017).
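As a rough check, the 16-fold figure follows directly from the reported counts. Below is a minimal sketch of that arithmetic in Python; it assumes a simple ratio of crude risks, which may differ from the authors' exact statistical method (for example, if they adjusted for follow-up time):

```python
# Crude relative-risk check using only the counts reported in the article.
# This is illustrative arithmetic, not the authors' statistical analysis.
jak_cases, jak_total = 4, 69            # lymphomas among JAK1/2 inhibitor-treated patients
other_cases, other_total = 2, 626 - 69  # lymphomas among the remaining 557 patients

risk_jak = jak_cases / jak_total        # ~0.058 -> 5.8%
risk_other = other_cases / other_total  # ~0.0036 -> 0.36%

print(f"Relative risk: {risk_jak / risk_other:.1f}")  # ~16.1, matching the reported 16-fold increase
```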
The lymphomas were diagnosed within 13-35 months of starting JAK1/2 inhibitors. In three patients, the disease was found in the bone marrow and peripheral blood; one patient had disease in mammary tissue and another in mucosal tissue. All four lymphomas showed positive MYC and p53 staining.
All four patients had been treated with ruxolitinib, one was also treated with fedratinib, and three of the four had been pretreated with alkylating agents.
Meanwhile, a second retrospective cohort study of 929 patients with myeloproliferative neoplasms in Paris, reported in the same paper, found that 3.51% of those treated with ruxolitinib developed lymphoma, compared with 0.23% of conventionally treated patients.
Using archived bone marrow samples from 54 of the 69 patients treated with JAK1/2 inhibitors, researchers discovered that 15.9% – including three of the four B-cell lymphoma patients (the fourth was not tested) – had a preexisting B-cell clone, present as early as 47-70 months before the lymphoma diagnosis.
“In patients, the clonal B-cell population was present as long as 6 years before overt lymphoma and preceded JAK1/2 inhibition which offers the opportunity to determine patients at risk,” wrote Edit Porpaczy, MD, of the Comprehensive Cancer Center at the Medical University of Vienna, and her coauthors. “Targeted inhibition of JAK-STAT signaling appears to be required to trigger the appearance of the B-cell clone as other treatments eliminating the myeloid cell load in men do not exert a comparable effect.”
In the Viennese cohort, three of the lymphomas were aggressive CD19+ B-cell type, and the fourth was a nonspecified high-grade B-cell lymphoma.
Researchers also looked at the effects of JAK1/2 inhibition in STAT1-/- mice and found that two-thirds developed spontaneous myeloid hyperplasia with the concomitant presence of aberrant B cells.
“Upon STAT1-deficiency myeloid hyperplasia is paralleled by the occurrence of a malignant B-cell clone, which evolves into disease upon bone-marrow transplantation and gives rise to a leukemic lymphoma phenotype,” the authors wrote.
The study was supported by the Austrian Science Fund, the Anniversary Fund of the Austrian National Bank, and the WWTF Precision Medicine Program. Several authors reported support, funding, or advisory board positions with the pharmaceutical industry.
SOURCE: Porpaczy E et al. Blood. 2018 Jun 14. doi: 10.1182/blood-2017-10-810739.
FROM BLOOD
Key clinical point: JAK1/2 inhibitor therapy in patients with myeloproliferative neoplasms may be associated with a markedly increased risk of aggressive B-cell lymphoma.
Major finding: Patients with myeloproliferative neoplasms treated with JAK1/2 inhibitors have a 16-fold higher incidence of lymphoma.
Study details: A retrospective cohort study of 626 patients with myeloproliferative neoplasms.
Disclosures: The study was supported by the Austrian Science Fund, the Anniversary Fund of the Austrian National Bank, and the WWTF Precision Medicine Program. Several authors reported support, funding, or advisory board positions with the pharmaceutical industry.
Source: Porpaczy E et al. Blood. 2018 Jun 14. doi: 10.1182/blood-2017-10-810739.
Score predicts 3-month mortality in malignant pleural effusion
A score incorporating clinical and biological markers could help predict the risk of death in patients with malignant pleural effusion and their likelihood of responding well to pleurodesis, according to a study published online in The Lancet Oncology.
The researchers used five separate and independent datasets from three previous multicenter randomized controlled trials – TIME-1, TIME-2, and TIME-3 – to identify 17 candidate biomarkers of survival at 3 months and 7 candidate biomarkers of pleurodesis success at 3 months.
They combined these with clinical, radiological, and biological variables to develop the PROMISE model. The survival dataset included relative protein expression of tissue inhibitor of metalloproteinases 1, platelet-derived growth factor, vascular endothelial growth factor, cadherin 1, and interleukin 4; the pleurodesis dataset included tumor necrosis factor (TNF)–alpha, TNF-beta, interleukin 6, and fibroblast growth factor 2.
The model was then externally validated using complete case data from 162 individuals with malignant pleural effusion, just over one-third of whom had died before 3 months.
The clinical version of the model was based on prognostic factors for survival, including use of previous chemotherapy and radiotherapy, baseline ECOG performance status, cancer type, hemoglobin, C-reactive protein, and white blood cell count; a biological version of the score additionally incorporated biomarker expression.
The researchers found that the PROMISE scores showed “good discrimination”; on that basis, they defined four risk categories – A, B, C, and D – representing less than 25%, 25%-49%, 50%-74%, and 75% or greater risk of death by 3 months, respectively. However, none of the biomarkers associated with pleurodesis outcomes could be validated.
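For illustration, the published cutoffs map a predicted 3-month mortality risk onto the four categories as follows; this is a minimal sketch, and the function name and interface are hypothetical, not taken from the paper:

```python
def promise_category(risk: float) -> str:
    """Map a predicted 3-month mortality risk (a probability between 0 and 1)
    to a PROMISE risk category, using the cutoffs reported in the study.
    Illustrative only; the paper does not define this interface."""
    if not 0.0 <= risk <= 1.0:
        raise ValueError("risk must be a probability between 0 and 1")
    if risk < 0.25:
        return "A"  # <25% risk of death by 3 months
    if risk < 0.50:
        return "B"  # 25%-49%
    if risk < 0.75:
        return "C"  # 50%-74%
    return "D"      # >=75%

print(promise_category(0.40))  # -> "B"
```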
“All parameters included in the PROMISE score are independently associated with survival, and thus the identified markers permit some speculation as to their biological role in survival in malignant pleural effusion,” wrote Ioannis Psallidas, MD, of the Oxford Centre for Respiratory Medicine at Oxford (England) University Hospital, and his coauthors.
For example, they noted that patients who had previously been treated with chemotherapy and radiotherapy may have a poorer prognosis because of the development of more aggressive cancer after treatment. Similarly, white blood cell counts and C-reactive protein are markers of inflammation and are linked to poor tumor-specific immunity.
“Evidence suggests that clinical judgment alone is imprecise for estimation of patient survival, highlighted by the fact that physicians are ineffective in excluding study participants with poor prognosis from large clinical trials,” the authors wrote. “Although future confirmatory studies are required, the PROMISE score (either clinical or biological) could potentially be used in everyday clinical practice as a method to improve patient management and reduce associated health-care costs, and as an enrichment strategy for future clinical trials.”
The study was supported by the Oxford Respiratory Trials Unit, Medical Research Funding-University of Oxford, Slater & Gordon Research Fund, and Oxfordshire Health Services Research Committee Research Grants. No conflicts of interest were declared.
SOURCE: Psallidas I et al. Lancet Oncol. 2018 Jun 13. doi: 10.1016/S1470-2045(18)30294-8.
Around 50% of patients with a malignancy will present with dyspnea – mostly those with advanced disease. The scientific literature suggests pleurodesis is effective in around 70% of those who undergo the treatment, but in real life the proportion can be much lower.
While this study is to be congratulated for introducing a new scoring system to predict survival in malignant pleural effusion, the model did not seem to be able to predict the success of pleurodesis.
The search for predictive markers for successful pleurodesis was one of the most interesting goals of this study, but despite a rigorous analysis of many pleural fluid samples, these markers proved elusive. It may be that identifying even one predictive marker in such a heterogeneous group of primary malignancies is going to be a challenge.
Paul Baas, MD, and Sjaak Burgers, MD, are in the department of thoracic oncology at The Netherlands Cancer Institute in Amsterdam. These comments are taken from their editorial (Lancet Oncol. 2018 Jun 12. doi: 10.1016/S1470-2045(18)30361-9). No conflicts of interest were declared.
Around 50% of patients with a malignancy will present with dyspnea – mostly those with advanced disease. The scientific literature suggests pleurodesis is effective in around 70% of those who undergo the treatment but in real life the proportion can be much lower.
While this study is to be congratulated for introducing a new scoring system to predict survival in malignant pleural effusion, the model did not seem to be able to predict the success of pleurodesis.
The search for predictive markers for successful pleurodesis was one of the most interesting goals of this study, but despite a rigorous analysis of many pleural fluid samples, these markers proved elusive. It may be that identifying even one predictive marker in such a heterogeneous group of primary malignancies is going to be a challenge.
Paul Baas, MD, and Sjaak Burgers, MD, are in the department of thoracic oncology at The Netherlands Cancer Institute in Amsterdam. These comments are taken from their editorial (Lancet Oncol. 2018 Jun 12. doi: 10.1016/S1470-2045(18)30361-9). No conflicts of interest were declared.
Around 50% of patients with a malignancy will present with dyspnea – mostly those with advanced disease. The scientific literature suggests pleurodesis is effective in around 70% of those who undergo the treatment but in real life the proportion can be much lower.
While this study is to be congratulated for introducing a new scoring system to predict survival in malignant pleural effusion, the model did not seem to be able to predict the success of pleurodesis.
The search for predictive markers for successful pleurodesis was one of the most interesting goals of this study, but despite a rigorous analysis of many pleural fluid samples, these markers proved elusive. It may be that identifying even one predictive marker in such a heterogeneous group of primary malignancies is going to be a challenge.
Paul Baas, MD, and Sjaak Burgers, MD, are in the department of thoracic oncology at The Netherlands Cancer Institute in Amsterdam. These comments are taken from their editorial (Lancet Oncol. 2018 Jun 12. doi: 10.1016/S1470-2045(18)30361-9). No conflicts of interest were declared.
A score incorporating clinical and biological markers could help predict the risk of death in patients with malignant pleural effusion and their likelihood of responding well to pleurodesis, according to a study published online in The Lancet Oncology.
The researchers used five separate and independent datasets from three previous multicenter randomized controlled trials – TIME-1, TIME-2, and TIME-3 – to identify 17 candidate biomarkers of survival at 3 months and 7 candidate biomarkers of pleurodesis success at 3 months.
They combined these with clinical, radiological, and biological variables to develop the biological PROMISE model, which included relative protein expression of tissue inhibitor of metalloproteinases 1, platelet-derived growth factor, vascular endothelial growth factor, cadherin 1, and interleukin 4. The candidate biomarkers for pleurodesis included tumor necrosis factor alpha, TNF-beta, interleukin 6, and fibroblast growth factor 2.
The model was then externally validated using complete case data from 162 individuals with malignant pleural effusion, just over one-third of whom had died before 3 months.
The researchers also developed a clinical model based on prognostic factors for survival, including use of previous chemotherapy and radiotherapy, baseline ECOG performance status, cancer type, hemoglobin, and white blood cell count.
The researchers found that the PROMISE scores showed “good discrimination,” and on that basis they defined four risk categories – A, B, C, and D – representing less than 25%, 25%-49%, 50%-74%, and 75% or greater risk of death by 3 months, respectively. However, none of the biomarkers associated with pleurodesis outcomes could be validated.
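The paper publishes no code, but the category assignment described above amounts to binning a predicted 3-month mortality probability into four bands. Here is a minimal sketch of that mapping; the function name and the probability input are our illustrative assumptions, not the authors' implementation:

```python
def promise_category(predicted_risk: float) -> str:
    """Map a predicted probability of death by 3 months (0-1) to the
    PROMISE risk categories described in the study:
    A (<25%), B (25%-49%), C (50%-74%), D (>=75%)."""
    if not 0.0 <= predicted_risk <= 1.0:
        raise ValueError("predicted_risk must be a probability in [0, 1]")
    if predicted_risk < 0.25:
        return "A"
    if predicted_risk < 0.50:
        return "B"
    if predicted_risk < 0.75:
        return "C"
    return "D"

# Example: a patient with a 40% predicted risk falls in category B.
print(promise_category(0.40))  # -> "B"
```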
“All parameters included in the PROMISE score are independently associated with survival, and thus the identified markers permit some speculation as to their biological role in survival in malignant pleural effusion,” wrote Ioannis Psallidas, MD, of the Oxford Centre for Respiratory Medicine at Oxford (England) University Hospital, and his coauthors.
For example, they noted that patients who had previously been treated with chemotherapy and radiotherapy may have a poorer prognosis because of the development of more aggressive cancer after treatment. Similarly, white blood cell counts and C-reactive protein are markers of inflammation and are linked to poor tumor-specific immunity.
“Evidence suggests that clinical judgment alone is imprecise for estimation of patient survival, highlighted by the fact that physicians are ineffective in excluding study participants with poor prognosis from large clinical trials,” the authors wrote. “Although future confirmatory studies are required, the PROMISE score (either clinical or biological) could potentially be used in everyday clinical practice as a method to improve patient management and reduce associated health-care costs, and as an enrichment strategy for future clinical trials.”
The study was supported by the Oxford Respiratory Trials Unit, Medical Research Funding-University of Oxford, Slater & Gordon Research Fund, and Oxfordshire Health Services Research Committee Research Grants. No conflicts of interest were declared.
SOURCE: Psallidas I et al. Lancet Oncol. 2018 Jun 13. doi: 10.1016/S1470-2045(18)30294-8.
FROM THE LANCET ONCOLOGY
Key clinical point: Clinical and biological markers predict 3-month mortality in malignant pleural effusion.
Major finding: The PROMISE score shows good discrimination between high and low risk of death in malignant pleural effusion.
Study details: Development and external validation of prognostic markers in a multicohort study.
Disclosures: The study was supported by the Oxford Respiratory Trials Unit, Medical Research Funding–University of Oxford, Slater & Gordon Research Fund, and Oxfordshire Health Services Research Committee Research Grants. No conflicts of interest were declared.
Source: Psallidas I et al. Lancet Oncol. 2018 Jun 13. doi: 10.1016/S1470-2045(18)30294-8.
New chronic lymphocytic leukemia guidelines from the UK
Fludarabine, cyclophosphamide, and rituximab are recommended as initial therapy for patients with chronic lymphocytic leukemia who do not have TP53 disruption, according to new guidelines from the British Society for Haematology.
The guidelines update the 2012 recommendations on chronic lymphocytic leukemia (CLL) to include “significant” developments in treatment. They were published in the British Journal of Haematology.
Anna H. Schuh, MD, of the department of oncology at the University of Oxford (England), and her coauthors noted that, while these guidelines apply to treatments available outside clinical trials, wherever possible patients with CLL should be treated within the clinical trial setting.
While recommending fludarabine, cyclophosphamide, and rituximab as first-line therapy, the guideline authors acknowledged that the combination of bendamustine and rituximab is an acceptable alternative for patients who cannot take the triple therapy because of factors such as advanced age, renal impairment, or limited marrow capacity.
Similarly, less-fit patients could also be considered for chlorambucil-obinutuzumab or chlorambucil-ofatumumab combinations.
All patients diagnosed with CLL should be tested for TP53 deletions and mutations before each line of therapy, the guideline committee recommended. TP53 disruption – either a deletion of chromosome 17p or a mutation in the TP53 gene – renders chemoimmunotherapy ineffective. However, there is compelling evidence for the efficacy of ibrutinib in these patients, or of idelalisib plus rituximab for those with cardiac disease or receiving vitamin K antagonists.
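Read as an algorithm, the first-line recommendations above reduce to a short decision rule keyed on TP53 status and fitness for intensive chemoimmunotherapy. The sketch below is our paraphrase of the guideline text, for illustration only – real treatment selection weighs many factors it omits:

```python
def first_line_cll(tp53_disrupted: bool, fit_for_fcr: bool,
                   cardiac_disease_or_vka: bool = False) -> list[str]:
    """Illustrative paraphrase of the BSH first-line recommendations
    described above; not a clinical decision tool."""
    if tp53_disrupted:
        # Chemoimmunotherapy is ineffective when TP53 is disrupted.
        if cardiac_disease_or_vka:
            return ["idelalisib + rituximab"]
        return ["ibrutinib"]
    if fit_for_fcr:
        return ["fludarabine + cyclophosphamide + rituximab"]
    # Less-fit patients: lower-intensity alternatives.
    return ["bendamustine + rituximab",
            "chlorambucil + obinutuzumab",
            "chlorambucil + ofatumumab"]

# Example: a fit patient without TP53 disruption.
print(first_line_cll(tp53_disrupted=False, fit_for_fcr=True))
```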
With respect to maintenance therapy, the guidelines noted that this was not routinely recommended in CLL as “it is unclear to what extent the progression-free survival benefit is offset by long-term toxicity.”
Patients who are refractory to chemoimmunotherapy, who have relapsed, or who cannot be retreated with chemoimmunotherapy should be treated with idelalisib with rituximab or ibrutinib monotherapy, the guidelines suggested.
“Deciding whether ibrutinib or idelalisib with rituximab is most appropriate for an individual patient depends on a range of factors, including toxicity profile and convenience of delivery,” the authors wrote. However, they noted that the value of adding bendamustine to either option was unclear, as research had not shown significant associated gains in median progression-free survival.
Allogeneic stem cell transplantation should be considered as a treatment option for patients who have failed chemotherapy, who have a TP53 disruption and have not responded to B-cell receptor signaling pathway inhibitors such as ibrutinib, or who have undergone Richter transformation.
The guidelines also addressed the issue of autoimmune cytopenias, which occur in 5%-10% of patients with CLL and can actually precede the diagnosis of CLL in about 9% of cases.
Patients in whom autoimmune cytopenia is the dominant clinical feature should be treated with corticosteroids, intravenous immunoglobulin, or rituximab. For patients in whom the cytopenia is triggered by CLL therapy, however, the guidelines recommended halting treatment and beginning immunosuppression.
The guideline development was supported by the British Society for Haematology. The UK CLL Forum is a registered charity that receives funding from a number of pharmaceutical companies.
SOURCE: Schuh AH et al. Br J Haematol. 2018 Jul 15. doi: 10.1111/bjh.15460.
FROM THE BRITISH JOURNAL OF HAEMATOLOGY
Key clinical point: Fludarabine, cyclophosphamide, and rituximab are recommended as initial therapy for patients with CLL who do not have TP53 disruption.
Major finding: All patients diagnosed with CLL should be tested for TP53 disruption.
Study details: A guideline developed by the British Society for Haematology offering recommendations for CLL treatment outside clinical trials.
Disclosures: The guideline development was supported by the British Society for Haematology. The UK CLL Forum is a registered charity that receives funding from a number of pharmaceutical companies.
Source: Schuh AH et al. Br J Haematol. 2018 Jul 15. doi: 10.1111/bjh.15460.
Methylphenidate deemed best first-line option for ADHD in children
Methylphenidate appears to be the safest and most effective treatment option for attention-deficit/hyperactivity disorder in children and adolescents, while amphetamines are the preferred first-line choice in adults, a systematic review and meta-analysis have found.
Researchers reported the results of a network meta-analysis of 133 double-blind randomized controlled trials – 81 in children and adolescents, 51 in adults, and 1 in both – involving a total of 10,068 children and adolescents, and 8,131 adults. The included studies compared a range of medications with placebo or with each other in head-to-head trials. The meta-analysis was published online Aug. 7 in The Lancet Psychiatry.
At 12 weeks, all the medications, which included amphetamines, atomoxetine, bupropion, clonidine, guanfacine, methylphenidate, and modafinil, were found to be better than placebo in reducing core ADHD symptoms in children and adolescents, according to clinicians’ ratings. However, when teachers’ ratings were used, only methylphenidate and modafinil were better than placebo.
In adults, clinicians’ ratings found that amphetamines, methylphenidate, bupropion, and atomoxetine – but not modafinil – were better than placebo.
In head-to-head trials, clinicians’ ratings favored amphetamines over modafinil, atomoxetine, and methylphenidate in children, adolescents, and adults.
But in adults, amphetamines, bupropion, and methylphenidate all beat placebo.
When it came to tolerability in children and adolescents, guanfacine and amphetamines were the only two treatments less well tolerated than placebo. However, a post hoc analysis suggested lisdexamfetamine had lower tolerability than other amphetamines, at least in children and adolescents. In adults, modafinil, amphetamines, methylphenidate, and atomoxetine were all less well tolerated than placebo.
“Overall, all medications, except modafinil in adults, were more efficacious than placebo for the short-term treatment of ADHD, and they were less efficacious and less well tolerated in adults than in children and adolescents,” wrote Samuele Cortese, MD, PhD, of the University of Southampton (England), and his coauthors. “However, the included medications were not equivalent in relation to their mean effect size, which ranged from moderate to high and varied according to the type of rater.”
For example, while atomoxetine had the lowest mean effect size in children and adolescents based on clinicians’ ratings, in adults, it was on par with methylphenidate. Amphetamines increased diastolic blood pressure in children but not in adults.
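The “mean effect size” at issue here is the standardized mean difference that network meta-analyses typically pool for continuous symptom scores. As a reminder of how that quantity is computed, here is a minimal sketch of Cohen's d with the Hedges' g small-sample correction; the example numbers are invented:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference between treatment and control,
    using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def hedges_g(d, n_t, n_c):
    """Small-sample bias correction applied to Cohen's d."""
    return d * (1 - 3 / (4 * (n_t + n_c) - 9))

# Hypothetical symptom-score reduction: treatment vs. placebo.
d = cohens_d(12.0, 7.0, 9.0, 8.5, 120, 118)
print(round(hedges_g(d, 120, 118), 2))  # -> 0.57, a moderate effect
```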
“Taking into account both efficacy and safety, evidence from this meta-analysis supports methylphenidate in children and adolescents, and amphetamines in adults, as preferred first-choice medications for the short-term treatment of ADHD,” the authors wrote.
Dr. Cortese and his coauthors cited a few limitations. One is that the most recent study included in their meta-analysis was published in April 2017. When the researchers conducted a PubMed search in May 2018, they found three additional studies that met their criteria. “Since we already had 133 included studies, we decided that adding these three studies would not have changed the final results materially,” they wrote.
The study was supported by the Stichting Eunethydis (European Network for Hyperkinetic Disorders), and the U.K. National Institute for Health Research Oxford Health Biomedical Research Centre. Nine authors declared support, funding, or advisory roles with a range of organizations or the pharmaceutical industry.
SOURCE: Cortese S et al. Lancet Psychiatry. 2018 Aug 7. doi: 10.1016/S2215-0366(18)30269-4.
FROM THE LANCET PSYCHIATRY
Key clinical point: “All medications, except modafinil in adults, were more efficacious than placebo for the short-term treatment of ADHD.”
Major finding: Methylphenidate showed the greatest tolerability and efficacy of ADHD treatments for children and adolescents.
Study details: Systematic review and meta-analysis of 133 double-blind randomized controlled trials.
Disclosures: The study was supported by the Stichting Eunethydis (European Network for Hyperkinetic Disorders), and the U.K. National Institute for Health Research Oxford Health Biomedical Research Centre. Nine authors declared support, funding, or advisory roles with a range of organizations or the pharmaceutical industry.
Source: Cortese S et al. Lancet Psychiatry. 2018 Aug 7. doi: 10.1016/S2215-0366(18)30269-4.
Telomere length linked to COPD exacerbations, mortality
Shorter telomere length is associated with a higher risk of exacerbations and death in patients with chronic obstructive pulmonary disease, according to a study published in Chest.
The evidence suggests that chronic obstructive pulmonary disease (COPD) may be a disease of accelerated aging, partly because of its relation to other senescence-related disorders such as osteoporosis and dementia, but also because it shows an exponential increase in prevalence in older age.
Telomere length is a measure of cellular senescence, and previous research has found that telomeres are shortened in the peripheral leukocytes of patients with COPD, compared with healthy controls.
In this study, researchers examined the absolute telomere length of 576 people with moderate to severe COPD who were participating in the MACRO (Macrolide Azithromycin for Prevention of Exacerbations of COPD) study.
They found that individuals in the lowest quartile of telomere length had significantly worse health status and a higher exacerbation rate after accounting for treatment, compared with individuals in the higher quartiles.
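Quartile analyses of this kind simply split the cohort at the 25th, 50th, and 75th percentiles of the measurement. A minimal sketch with pandas follows; the column names and values are invented for illustration:

```python
import pandas as pd

# Hypothetical per-patient data; 'telomere_kbp' stands in for the
# absolute telomere length measurement used in the study.
df = pd.DataFrame({
    "patient_id": range(1, 9),
    "telomere_kbp": [3.1, 4.8, 2.2, 5.9, 3.7, 4.1, 2.9, 5.2],
})

# qcut splits the cohort into four equally sized groups (quartiles).
df["quartile"] = pd.qcut(df["telomere_kbp"], q=4,
                         labels=["Q1 (shortest)", "Q2", "Q3", "Q4 (longest)"])
print(df.sort_values("telomere_kbp"))
```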
Patients with shorter telomere length had worse health status, as defined by higher St. George’s Respiratory Questionnaire scores. In the placebo arm of the study, the exacerbation rate (rate ratio, 1.50; 95% confidence interval, 1.16-1.95; P = .002) and mortality risk (hazard ratio, 9.45; 95% CI, 2.85-31.36; P = .015) were significantly higher in the shorter telomere group than in the longer telomere group; these differences were not observed in the azithromycin arm.
Patients with shorter telomeres also had an 800% higher risk of total mortality, compared with individuals with longer telomeres, although this was only evident in the placebo arm of the study, not the azithromycin arm. However, the authors noted that these data should be interpreted with caution because of the small number of deaths during the study.
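That 800% figure is essentially the hazard ratio reported above re-expressed as a percent increase in hazard, (HR − 1) × 100, so an HR of 9.45 corresponds to roughly an 845% higher risk. A one-line check of the arithmetic (our illustration, not the authors' code):

```python
def hr_to_percent_increase(hr: float) -> float:
    """Re-express a hazard ratio as a percent increase in hazard."""
    return (hr - 1.0) * 100.0

print(round(hr_to_percent_increase(9.45)))  # -> 845, i.e. roughly 800% higher
```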
“Together, these data support the notion that COPD is a systemic disease of accelerated aging and that replicative senescence, denoted by peripheral blood telomeres, is associated with poor health outcomes in COPD,” wrote Minhee Jin, of the University of British Columbia, Vancouver, and coauthors.
“It is now well established that replicative senescence results in a change of cellular phenotype to a proinflammatory state, a process that has been referred to as senescence-associated secretory phenotype,” they added.
The study also found that the median value for telomere length across the study participants – who had a mean age of 66 years – was equivalent to the expected value for someone in their 80s, “suggesting that on average MACRO participants were biologically much older than their chronological age.”
Researchers also noted that patients in the lowest quartile of telomere length had significantly lower forced vital capacity values, which suggested shorter telomeres could be a biomarker of restrictive physiology.
MACRO was funded by the U.S. National Heart, Lung, and Blood Institute, and the biomarker component of the study was funded by the Canadian Respiratory Research Network, Genome Canada, and the St. Paul’s Hospital Foundation. One author was an employee of GenomeDx Biosciences, and three declared funding from or consultancies with the pharmaceutical industry. No other conflicts of interest were reported.
SOURCE: Jin M et al. Chest. 2018 Jul 12. doi: 10.1016/j.chest.2018.05.022.
FROM CHEST
Key clinical point: Shorter telomeres are linked to an increased risk of chronic obstructive pulmonary disease exacerbations.
Major finding: Patients with shorter telomeres had an 800% higher risk of total mortality, compared with individuals with longer telomeres.
Study details: Data from 576 patients with chronic obstructive pulmonary disease who participated in the MACRO study.
Disclosures: MACRO was funded by the U.S. National Heart, Lung, and Blood Institute, and the biomarker component of the study was funded by the Canadian Respiratory Research Network, the Canadian Institutes of Health Research, Genome Canada, and the St. Paul’s Hospital Foundation. One author was an employee of GenomeDx Biosciences, and three authors declared funding from or consultancies with the pharmaceutical industry. No other conflicts of interest were reported.
Source: Jin M et al. Chest. 2018 Jul 12. doi: 10.1016/j.chest.2018.05.022.
Peer-comparison letters reduce physician quetiapine prescribing
A behavioral “nudge” intervention, targeting primary care prescribers who have particularly high off-label prescription rates of the antipsychotic quetiapine fumarate to older and disabled adults, has shown significant and long-lasting reductions in prescriptions.
A study, published Aug. 1 online by JAMA Psychiatry, looked at the effect of a “peer-comparison” letter, compared with a placebo letter, sent to 5,055 high quetiapine-prescribing primary care physicians in the Medicare program.
The letters said that the physicians’ quetiapine prescribing was extremely high, compared with their peers’ prescribing in the same state. Furthermore, the letters said the high-volume prescribers’ practices were under review because of concerns over medically unjustified use. They also encouraged the doctors to review their prescribing habits, while the placebo letter simply discussed an unrelated Medicare enrollment regulation.
Over the 9-month study, researchers saw a significant 11.1% reduction in the total number of days of quetiapine prescribing among physicians who received the intervention letter, compared with those who received the control letter (95% confidence interval, –13.1 to –9.2 days; P less than .001; adjusted difference, –319 days; 95% CI, –374 to –263 days; P less than .001). At 2 years, the cumulative reduction was 15.6% fewer days in the intervention group (95% CI, –18.1 to –13.0; P less than .001), compared with the control group.
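The headline percentage is a relative difference in total prescribing days between arms. A minimal sketch of the arithmetic follows; the day counts are invented to match the reported magnitudes, not taken from the paper, and the study's adjusted analysis controlled for additional factors:

```python
def relative_reduction(intervention_days: float, control_days: float) -> float:
    """Percent reduction in prescribing days relative to the control arm."""
    return (control_days - intervention_days) / control_days * 100.0

# Hypothetical average days prescribed per physician over 9 months:
# a 319-day absolute difference against a 2,869-day control baseline.
print(round(relative_reduction(2550.0, 2869.0), 1))  # -> 11.1
```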
The study also used Medicare data to look at the impact on patients and found that individuals whose physicians were in the intervention arm had 3.9% fewer days of quetiapine usage over the 9 months (95% CI, –5.0 to –2.9; P less than .001), compared with those in the control arm. The reduction was even greater among patients whose indications for quetiapine were deemed to be of “low value,” as opposed to those who were prescribed it for guideline-concordant indications, reported Adam Sacarny, PhD, of Columbia University, New York, and his coauthors.
When researchers looked in more detail at the reductions in prescriptions for guideline-concordant patients, they found that much of the reduction was offset by prescriptions from other prescribers – in particular, physicians with psychiatric specialization or other study prescribers from whom the patient had not previously received a quetiapine prescription.
The authors noted that the reductions for guideline-concordant patients could have negative effects if prescribers were reducing their quetiapine prescriptions indiscriminately.
“If this represented a harmful change for patients, we may have expected to see higher rates of adverse outcomes in the guideline-concordant patient group as prescribing rates decreased,” wrote Dr. Sacarny and his coauthors. “However, if anything, guideline-concordant patients experienced lower rates of hospital encounters after the intervention.”
The study did not see any evidence of substitution to other antipsychotics, nor was any significant difference found in hospital use or mortality between the two arms of the study.
Dr. Sacarny and his coauthors cited several limitations. One is that the analysis looked at prescribing covered by Medicare Part D only. Nevertheless, they said, the results show the impact that peer comparison letters can have on prescribing patterns.
“These results provide encouraging evidence that high prescribers of antipsychotics can decrease quetiapine prescribing, without adverse clinical consequences, in response to a letter highlighting their overall high rates of prescribing,” the authors wrote.
The study was supported by the Robert Wood Johnson Foundation, Abdul Latif Jameel Poverty Action Lab North America, and the Laura and John Arnold Foundation. No conflicts of interest were reported.
SOURCE: Sacarny A et al. JAMA Psychiatry. 2018 Aug 1. doi: 10.1001/jamapsychiatry.2018.1867.
FROM JAMA PSYCHIATRY
Key clinical point: A peer-comparison letter intervention significantly reduced quetiapine prescribing by primary care physicians.
Major finding: Peer-comparison letters achieved an 11.1% reduction in days of quetiapine prescribed (95% confidence interval, –13.1 to –9.2 days; P less than .001; adjusted difference, –319 days; 95% CI, –374 to –263 days; P less than .001).
Study details: Randomized controlled trial involving 5,055 primary care physicians with high quetiapine-prescribing rates.
Disclosures: The study was supported by the Robert Wood Johnson Foundation, Abdul Latif Jameel Poverty Action Lab North America, and the Laura and John Arnold Foundation. No conflicts of interest were declared.
Source: Sacarny A et al. JAMA Psychiatry. 2018 Aug 1. doi: 10.1001/jamapsychiatry.2018.1867.