Drug approved to treat hemophilia A in Kuwait
(Elocta) packaging
Photo courtesy of Sobi
The Ministry of Health in Kuwait has approved efmoroctocog alfa (Elocta®), a recombinant human factor VIII Fc-fusion protein, for the treatment of hemophilia A.
It is indicated for both on-demand and prophylactic treatment in hemophilia A patients of all ages.
Efmoroctocog alfa is the first recombinant factor VIII Fc fusion protein therapy approved for the treatment of hemophilia A in the Middle East region.
Efmoroctocog alfa is also approved in the European Union, Switzerland, Iceland, Liechtenstein, Norway, the US, Canada, Australia, New Zealand, Brazil, Taiwan, and Japan.
Efmoroctocog alfa was developed by fusing B-domain deleted factor VIII to the Fc portion of immunoglobulin G subclass 1. It is believed that this enables efmoroctocog alfa to utilize a naturally occurring pathway to prolong the time the therapy remains in the body.
Sobi and Biogen are collaborators in the development and commercialization of efmoroctocog alfa for hemophilia A.
The approval of efmoroctocog alfa in Kuwait was based on data from a pair of phase 3 studies: A-LONG and Kids A-LONG.
A-LONG
The A-LONG study included 165 previously treated males 12 years of age and older with severe hemophilia A. Researchers evaluated individualized and weekly prophylaxis to reduce or prevent bleeding episodes and on-demand dosing to treat bleeding episodes.
Prophylaxis with efmoroctocog alfa resulted in low annualized bleeding rates, and a majority of bleeding episodes were controlled with a single injection of efmoroctocog alfa.
None of the patients developed neutralizing antibodies, efmoroctocog alfa was considered well-tolerated, and the product had a prolonged half-life when compared with conventional recombinant factor VIII.
Kids A-LONG
The Kids A-LONG study included 71 boys (younger than 12) with severe hemophilia A who had at least 50 prior exposure days to factor VIII therapies.
The children saw their median annualized bleeding rate decrease with efmoroctocog alfa, and close to half of the children did not have any bleeding episodes while they were receiving efmoroctocog alfa.
None of the patients developed inhibitors, and researchers said adverse events were typical of a pediatric hemophilia population.
Novel CLL drugs could greatly increase costs
New research suggests the increasing use of oral targeted therapies for chronic lymphocytic leukemia (CLL) could raise US treatment costs for the disease by almost 600%.
Investigators modeled the evolving management of CLL from 2011 to 2025 and found that increasing use of the oral targeted therapies ibrutinib and idelalisib could greatly increase costs for both patients and payers.
The team detailed these findings in the Journal of Clinical Oncology.
“The rising cost of cancer care is a serious concern,” said study author Jagpreet Chhatwal, PhD, of Massachusetts General Hospital in Boston.
“The average cost of annual cancer treatment, which was below $10,000 per patient before 2000, has now increased to more than $100,000. Such increasing trends can limit access to new therapies, potentially undermining their clinical effectiveness. These new drugs are highly effective, but their high costs motivated us to project their changing economic burden and affordability.”
Dr Chhatwal and his colleagues noted that ibrutinib and idelalisib each cost around $130,000 per year, and treatment with these drugs may be continued indefinitely.
So the team set out to determine the potential financial impact of the drugs on payers’ budgets, as well as on Medicare-enrolled patients, who represent the majority of CLL patients in the US.
The investigators developed a model to simulate the evolving management of CLL from 2011 to 2025.
In one scenario, chemoimmunotherapy was the standard of care before 2014, while oral targeted therapies were used for patients with del(17p) and relapsed CLL from 2014 onward and for first-line treatment of CLL from 2016 onward.
The team also modeled a scenario in which chemoimmunotherapy was the standard of care throughout the entire time period and compared the costs between these scenarios.
The model projects that:
- Per-patient lifetime costs for CLL treatment will increase from $147,000 to $604,000 from 2016 onward
- The total out-of-pocket costs for Medicare patients will increase from $9200 to $57,000 for patients initiating treatment from 2016 onward
- The total annual cost of CLL management in the US will rise from $0.74 billion in 2011 to $5.13 billion in 2025, an increase of 590%.
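As a quick arithmetic check, the projected jump from $0.74 billion to $5.13 billion can be verified to match the reported increase of roughly 590% (a minimal sketch using only the figures quoted above):

```python
# Sanity check on the model's projected increase in the total annual US cost
# of CLL management (figures taken from the article).
cost_2011 = 0.74  # billion USD, 2011
cost_2025 = 5.13  # billion USD, 2025 projection

pct_increase = (cost_2025 - cost_2011) / cost_2011 * 100
print(f"Projected increase: {pct_increase:.0f}%")  # ~593%, i.e., almost 600%
```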
“Such substantial increases in the cost are mainly driven by high drug prices, prolonged treatment duration, and the increase in the number of patients living with CLL,” said study author Qiushi Chen, PhD, of Massachusetts General Hospital.
The investigators also noted that the standard measure used to determine the cost-effectiveness of a medical intervention is whether it costs less than $100,000 for each additional year of life gained. The projected cost-effectiveness ratio of oral targeted therapy in CLL is $189,000 for each year gained.
“At the current average wholesale prices, oral targeted therapies for CLL are not cost-effective, and prices would need to drop by 50% to 70% to become cost-effective,” Dr Chhatwal said.
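A rough, illustrative calculation (not the study's model) shows why the required price cut exceeds the gap between the two ratios: the cost-effectiveness ratio itself must fall by about 47% to reach the $100,000 threshold, but because drug price is only one component of the incremental cost, the price must drop further, which is consistent with the study's 50% to 70% estimate.

```python
# Illustrative only -- not the study's simulation model.
icer = 189_000       # projected $ per life-year gained (from the article)
threshold = 100_000  # conventional willingness-to-pay threshold

# Fractional reduction in the cost-effectiveness ratio needed to hit the threshold.
reduction_in_ratio = (icer - threshold) / icer
print(f"Ratio must fall by about {reduction_in_ratio:.0%}")  # ~47%

# Since non-drug costs (administration, monitoring, other care) don't shrink
# when the drug price falls, the drug price itself must drop by more than 47%
# to pull the overall ratio below the threshold.
```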
“We are not recommending that clinicians choose less effective CLL management strategies that do not include oral targeted therapies,” said study author Nitin Jain, MD, of the University of Texas MD Anderson Cancer Center in Houston.
“Instead, we propose that the prices of these drugs need to be reduced to make the treatment cost-effective and more affordable, something we hope may happen with all cancer drugs. We also believe more research is needed to explore whether we can discontinue targeted treatment of patients who have responded well without risking worsening of their health.”
Which Health Care Providers Are Most Likely to Get Vaccinated?
Making it easier for employees to get free flu vaccinations on site—and requiring vaccinations—has helped bump up coverage, according to an online survey conducted for the CDC.
Of 2,316 health care personnel who responded, 79% reported having gotten a flu shot for the 2015-2016 season, up 15.5 percentage points from the 2010-2011 estimate but similar to the 77.3% coverage for 2014-2015.
Physicians are most likely to get vaccinated (95.6%), whereas assistants and aides have the lowest coverage, although it was well above half (64.1%). Nurse practitioners and physician assistants also had a high rate of vaccination (90.3%), followed by nurses (90.1%), pharmacists (86.5%), and allied health professionals/technicians/technologists (84.7%).
Related: The Ads Say ‘Get Your Flu Shot Today,’ But It May Be Wiser To Wait
Coverage among staff in long-term care settings was up—from 63.9% in 2014-2015 to 69.2% for 2015-2016—but still consistently lower than the coverage in hospitals and ambulatory care. Coverage in those settings was similar in both seasons. Employer requirements "likely contributed" to the gradual increase in vaccination among health care staff in the settings with the lowest coverage, the researchers say.
In facilities where vaccination was required, coverage was nearly total (96.5%). But only 61% of health care personnel work in hospitals with vaccination requirements—and that’s at least 27 percentage points higher than the proportion in any other work setting, the researchers say. Aides and assistants reported the lowest prevalence of vaccination requirements (22.5%).
Related: New Vaccination Data & Trends
Next to requirements, cost influenced vaccination response. The majority of vaccinated health care staff got the shots at their workplace. Coverage was highest when free vaccination was available on-site for a day or more.
To boost vaccination among long-term care staff, the CDC and the National Vaccine Program Office offer a web-based tool kit that includes access to resources, strategies, and educational material (www.cdc.gov/flu/toolkit/long-term-care/index.htm). Employers and health care administrators can also check out the Guide to Community Preventive Services, which presents evidence to support on-site vaccination at no or low cost.
Related: Health Care Providers Impact on HPV Vaccination Rates
Source:
Black CL, Yue X, Ball SW, et al. MMWR. 2016;65(38):1026-1031.
IHS Funds Programs to Protect Native Youth from Substance Abuse
The IHS announced 42 new awards to promote best practice strategies for preventing suicide and substance abuse, incorporating culturally appropriate approaches that are effective for tribal communities.
The awards, totaling more than $7 million for 1 year, are specifically for Methamphetamine and Suicide Prevention Initiative (MSPI) funding. The award recipients focus on boosting positive youth development, fostering resiliency, and promoting family engagement among Native youth, the IHS says. “We know that protective factors provided through caring adults, traditional practices, and Native language and culture help offset negative outcomes and foster the long-term development of resilience,” said IHS Principal Deputy Director Mary Smith, in announcing the awards.
Current funded projects include the Ohkay Owingeh MSPI Project in New Mexico. The evidence- and practice-based prevention program, conducted by the local Boys and Girls Club, “strongly focuses” on the issues surrounding methamphetamine and other drugs and self-harm in Native communities.
Another funded program, Fresno American Indian Health Project, targets Native youth at risk for substance abuse and suicide in the San Francisco Bay Area. The Stronghold Project II after-school programs help to strengthen cultural systems and family capacity, addressing family violence and suicide due to substance abuse.
From 2009 through 2015, MSPI supported > 12,200 people entering treatment for methamphetamine abuse, plus > 16,560 substance use and mental health disorder encounters via telehealth. The funding also supported training nearly 17,000 professionals and community members in suicide crisis response, with nearly 700,000 encounters with youth through prevention activities. The recently announced awards build on the more than $13 million awarded in 2015.
Most children’s hypertension goes undiagnosed and untreated
Twenty-three percent of children with hypertension and 10% of those with prehypertension were diagnosed by clinicians, based on data from a retrospective study of more than 398,000 children in the United States.
In addition, only 6% of children who met criteria for hypertension received treatment within a year of their diagnosis.
The researchers reviewed data from 398,079 children and adolescents aged 3-18 years who were part of the Comparative Effectiveness Research Through Collaborative Electronic Reporting Consortium. The patients had at least three visits with blood pressure and height measured.
The final study population included 12,138 children with hypertension at 44 sites and 38,874 children with prehypertension at 77 sites. Of the children with hypertension, 23% had hypertension or abnormal blood pressure diagnosis in their electronic health record (EHR). In addition, 32% of 4,996 children with stage 2 hypertension had an EHR diagnosis. A diagnosis was more likely in children who were male, taller, older, heavier, had at least one blood pressure measurement in the stage 2 range, or who had additional measurements beyond the three needed for a diagnosis.
Of the children with prehypertension, 10% had a diagnosis of hypertension or abnormal blood pressure in their EHRs. Diagnosis was more common among males and those who were older, heavier, taller, had more than one blood pressure measurement in the stage 2 range or had additional readings beyond those needed for diagnosis, the investigators said.
Of 2,813 pediatric patients who met criteria for hypertension and continued to have high blood pressure readings, only 6% were prescribed medication within 12 months of diagnosis. The average age for medication initiation was almost 14 years, and the most commonly prescribed medications were angiotensin-converting-enzyme inhibitors or angiotensin-II-receptor blockers for 35% of children, diuretics for 22%, calcium channel blockers for 17%, and beta blockers for 10%, Dr. Kaelber and his associates reported.
The study results were limited by several factors, among them the reliance on ICD-9 codes for identification of abnormal blood pressure and inclusion of all reasons (preventive care and nonpreventive care) for visits to primary care pediatric providers. The findings, however, suggest that “intervention is needed to help pediatric primary care clinicians recognize and treat hypertension and prehypertension,” they wrote.
Funding was provided by U.S. Department of Health & Human Services grants and by the Eunice Kennedy Shriver National Institute of Child Health and Human Development. The researchers had no relevant financial disclosures.
“Childhood hypertension is associated with increased carotid intima media thickness, increased left ventricular mass, and increased arterial stiffness, all precursors to adverse cardiovascular outcomes in adulthood,” wrote Kevin D. Hill, MD, and Jennifer S. Li, MD, in an accompanying editorial (Pediatrics. 2016 Nov 22;138:e20162857. doi: 10.1542/peds.2016-2857).
Although some may question the study findings, “there are compelling reasons to believe the results,” in part because the study’s definition of hypertension and reported 3.3% hypertension rate are consistent with current guidelines and previous studies, they noted.
“Hypertension is indeed more challenging to diagnose in children because of age, sex, and height-related variability in blood pressure norms,” they said. Evaluation of blood pressure percentiles, however, should be routine in pediatric practice.
More research is needed, including head-to-head comparisons of drugs and assessments of lifestyle interventions, the editorialists said. However, “it is clear that childhood hypertension is a major public health concern. The clinical manifestations may be silent during childhood, but this should not deter early diagnosis and treatment.”
Dr. Hill and Dr. Li are with Duke University in Durham, N.C. They had no financial conflicts to disclose. Their work was supported in part by the National Institutes of Health.
FROM PEDIATRICS
Key clinical point: Pediatric hypertension is underdiagnosed, and medication guidelines are inconsistently followed.
Major finding: Only 23% of children with hypertension and 10% of those with prehypertension were diagnosed by clinicians; 6% of those with hypertension were prescribed medication.
Data source: A retrospective cohort study including 398,079 pediatric patients from 196 clinics in 27 states.
Disclosures: Funding was provided by U.S. Department of Health & Human Services grants and by the Eunice Kennedy Shriver National Institute of Child Health and Human Development. The researchers had no relevant financial disclosures.
VIDEO: Urate lowering therapy improved kidney function
WASHINGTON – Urate lowering therapy improved kidney function in patients with chronic kidney disease (CKD), according to a large retrospective study presented at the annual meeting of the American College of Rheumatology. Moreover, patients with CKD stage 3 derived the most benefit from urate lowering therapy, and those with CKD stage 2 also benefited to a lesser degree. Patients with CKD stage 4 had no benefit from urate lowering therapy.
“Two years ago we showed that urate lowering therapy did not worsen kidney function in patients with chronic kidney disease. This study shows that their kidney function improved [with urate lowering therapy],” said Gerald D. Levy, MD, MBA, a rheumatologist at Kaiser Permanente of Southern California, Downey, Calif.
The study was conducted from 2008 to 2014 and included 12,751 patients with serum urate levels of above 7 mg/dL and CKD Stages 2, 3, and 4 at the index date, defined as the first time this test result was reported. Patients were drawn from the Kaiser Permanente database and were treated by primary care physicians. Patients were followed for 1 year from the index date. The primary outcome measure was a 30% increase or a 30% decrease in glomerular filtration rate (GFR) from baseline to the last available result.
Of the 12,751 patients, 2,690 were on urate lowering therapy and 10,061 were not on urate lowering therapy. Goal serum urate (sUA) was achieved in 1,118 (42%) of patients on urate lowering therapy. Among patients who achieved goal sUA, a 30% improvement in GFR was observed in 17.1% versus 10.4% of patients who did not achieve sUA goal, for an absolute difference of 6.7% (P less than .001).
For patients at goal versus those not at goal, the ratios of improvement were 3.4 and 3.8, respectively.
“This study suggests that patients with CKD should be tested for uric acid independent of whether they have gout or not. Getting to goal is important. Stage 3 CKD is the sweet spot where patients got the most pronounced benefit from urate lowering therapy,” he stated. “Stage 4 CKD is too late.”
Dr. Levy discussed the findings in a video interview during the meeting.
AT THE ACR ANNUAL MEETING
Tinzaparin is a safe, effective anticoagulant in patients on dialysis
CHICAGO – Tinzaparin was safe and effective as an anticoagulant for hemodialysis patients, based on results from the Intermittent Hemodialysis Anticoagulation with Tinzaparin (HEMO-TIN) trial presented at the annual meeting sponsored by the American Society of Nephrology.
In the multicenter randomized controlled trial of 192 adults on hemodialysis, tinzaparin, a low molecular weight heparin with antithrombotic properties, was compared with unfractionated heparin. Tinzaparin has been considered for hemodialysis patients because it is thought to be less dependent on renal clearance than are other low molecular weight heparins, Christine Ribic, MD, MSc, of McMaster University, Hamilton, Ont., said in reporting the results.
After 3 months, the 78 patients remaining in the tinzaparin group crossed over to receive unfractionated heparin for 3 months. The 79 patients remaining in the unfractionated heparin group crossed over to receive tinzaparin for 3 months. Of these 156 patients, 125 completed the 3-month crossover phase.
There were 421 bleeding events in the 12,125 hemodialysis sessions studied. They were evenly distributed between the groups, with 212 (50.4%) in those receiving unfractionated heparin and 209 (49.6%) in those receiving tinzaparin. The prevalence of major bleeds (2.1% vs 1.6%), clinically important nonmajor bleeds (1.2% vs 0.2%), and minor bleeds (47.0% vs 47.7%) was also similar between the unfractionated heparin and tinzaparin groups.
Anti-Xa heparin levels were used as a surrogate measure of low molecular weight heparin activity levels and bleeding risk due to bioaccumulation. In tinzaparin-treated patients, anti-Xa heparin levels never exceeded a value of 0.2 either before or after dialysis. This value was considered the threshold between safety and increased risk for bleeding. This threshold level was routinely exceeded pre- and post-dialysis in patients receiving unfractionated heparin at baseline and both before and after crossover.
Grade 4 clotting was similar for tinzaparin and unfractionated heparin, occurring in 23 of 6,095 (0.4%) unfractionated heparin hemodialysis sessions and 41 of 6,030 (0.7%) tinzaparin hemodialysis sessions. Mean dialyzer clotting scores and mean air trap clotting scores were also comparable.
The trial was supported by Leo Pharma, the maker of tinzaparin (innohep), in collaboration with McMaster University. Dr. Ribic is the sponsor of the trial.
AT KIDNEY WEEK 2016
Key clinical point: Tinzaparin outcomes were comparable to those with unfractionated heparin, and tinzaparin may be safer because it is less dependent on renal clearance than are other low molecular weight heparins.
Major finding: Mean anti-Xa levels post-hemodialysis did not exceed 0.2 for tinzaparin, indicating no residual anticoagulant effect.
Data source: Randomized, double-dummy, blinded crossover controlled trial involving 192 patients.
Disclosures: Study sponsor was McMaster University, Hamilton, Ont. The study was funded by Leo Pharma. Dr. Ribic reported having no financial disclosures.
Long-term remission maintenance in ANCA-associated vasculitis leans toward rituximab over azathioprine
WASHINGTON – Rituximab was superior to azathioprine as maintenance therapy for antineutrophil cytoplasmic antibody–associated vasculitis over long-term follow-up of the MAINRITSAN trial. At 60 months, rituximab significantly improved overall survival and relapse-free survival, compared with azathioprine.
At 60 months, overall survival (OS) rates were 100% for rituximab (Rituxan) versus 93% for azathioprine (P = .045), and relapse-free survival (RFS) rates were 57.9% versus 37.3%, respectively (P = .012).
These long-term maintenance results build on the primary results of MAINRITSAN, previously published in 2014 (N Engl J Med. 2014 Nov 6;371[19]:1771-80). The primary results showed the superiority of rituximab maintenance therapy versus the then-gold standard azathioprine in maintaining antineutrophil cytoplasmic antibody (ANCA)–associated vasculitis (AAV) remission at 28 months following induction therapy with a cyclophosphamide/glucocorticoid regimen.
“Following publication of the primary results, some uncertainties remained as to the duration of remission on rituximab. There is a need for therapy that can prevent relapse over the longer term,” lead author Benjamin Terrier, MD, of the National Referral Center for Rare Systemic Autoimmune Diseases at Cochin Hospital and Paris Descartes University in France, said in his presentation of the long-term follow-up data at the annual meeting of the American College of Rheumatology.
The follow-up results indicate that “over the long term, despite late relapses after the 28-month initial follow-up period, maintenance therapy with rituximab remained significantly superior to azathioprine to maintain remission at 60 months and was associated with better survival,” Dr. Terrier said.
The study included 115 newly diagnosed or relapsing patients with AAV (granulomatosis with polyangiitis, microscopic polyangiitis, or eosinophilic granulomatosis with polyangiitis). Of these, 80% were newly diagnosed. After achieving remission on induction therapy, patients were randomized to rituximab infusion 500 mg on day 1, day 15, and 5.5 months later, then every 6 months for 18 months or to azathioprine for 22 months.
The investigators collected data prospectively on major and minor relapses and adverse events, using a Q-TWIST (Quality Adjusted Time Without Symptoms and Toxicity) analysis to show the trade-off between toxicity and disease activity.
For all relapses, major and minor, rituximab maintained superiority over azathioprine at 60 months. There were 24 events in the rituximab arm: 11 minor relapses and 13 major relapses. There were 36 events in the azathioprine arm: 10 minor relapses, 25 major relapses, and 1 death. Major RFS survival rates were 71.9% versus 49.4%, respectively (P = .003).
“There was an absolute difference of 12 months favoring rituximab for RFS,” Dr. Terrier said.
Serious infections were numerically increased in the rituximab-treated group: 30 compared with 20 in the azathioprine group. Cardiovascular event rates were similar for both groups.
Q-TWIST analysis found significantly longer quality-adjusted time without progression or toxicity in the rituximab arm (P less than .001). The cumulative dose of steroids was comparable between treatment groups.
Six cancers were found in the azathioprine arm (including four skin cancers), compared with two in the rituximab arm.
In the rituximab group, PR3 ANCA positivity or ANCA persistence 12 months after starting rituximab maintenance therapy were associated with higher major relapse rates.
“Combining these two factors allows discerning low relapse rate. Patients negative for ANCA and for PR3 ANCA were at low risk,” Dr. Terrier told the audience. “ANCA monitoring seems to be relevant to guide treatment duration.”
Best rituximab regimen?
A separate study presented at a poster session compared the systemic regimen used in MAINRITSAN (as a control group) versus an experimental regimen of fixed 500-mg rituximab infusions on day 0 post randomization and then every 3 months until month 18, based on ANCA status/titer and/or circulating CD19 B-cell reappearance. The open-label, randomized study included 163 patients with granulomatosis with polyangiitis or microscopic polyangiitis in complete remission after induction therapy with glucocorticoids and cyclophosphamide or rituximab or methotrexate.
At 28 months, the relapse rate was 8% in the control arm and 14% in the experimental arm, a difference that was not statistically significant.
“We found no difference in the primary endpoint of relapse between the two regimens, but the experimental arm received fewer infusions. From this study, we cannot make a strong recommendation for the experimental regimen, but we think it is better, because there is less cumulative exposure to rituximab,” stated lead author Pierre Charles, MD, also of Cochin Hospital.
Both studies were funded by Hoffmann-La Roche. Dr. Terrier and Dr. Charles disclosed financial support from Hoffmann-La Roche.
Over the long term, patients initially randomized to rituximab maintenance therapy in the initial phase of the MAINRITSAN trial continued to be more likely to remain in remission than were those who had been randomized to azathioprine. The maintenance therapy data for rituximab look better than for azathioprine.
In clinical practice, individualizing the timing of repeat rituximab may be favorable for remission maintenance rather than having the same fixed dose for all comers, as was used in the MAINRITSAN trial.
By tailoring therapy, it may be possible to use less medication. If patients’ B cells remain depleted and ANCA is stable, flare is unlikely in the next 3 months, but if B cells are reconstituting, particularly in concert with a rising ANCA, I would generally administer a “remission maintenance” dose of rituximab, particularly in an individual who has had relapsing disease in the past. Perhaps patients could use less cumulative rituximab if these parameters are used to make treatment decisions. The poster presentation by Pierre Charles, MD, suggested that this approach indeed is feasible.
Robert F. Spiera, MD, is director of the Scleroderma, Vasculitis, & Myositis Center at the Hospital for Special Surgery, New York. He is also professor of clinical medicine at Cornell University, New York. He made these comments in an interview. Dr. Spiera has received research funding and consulting fees from Roche/Genentech, which markets rituximab.
WASHINGTON – Rituximab was superior to azathioprine as maintenance therapy for antineutrophil cytoplasmic antibody–associated vasculitis over long-term follow-up of the MAINRITSAN trial. At 60 months, rituximab significantly improved overall survival and relapse-free survival, compared with azathioprine.
At 60 months, overall survival (OS) rates were 100% for rituximab (Rituxan) versus 93% for azathioprine (P = .045), and relapse-free survival (RFS) rates were 57.9% versus 37.3%, respectively (P = .012).
These long-term maintenance results build on the primary results of MAINRITSAN that were previously published in 2014 (N Engl J Med. 2014;Nov 6;371[19]:1771-80). The primary results showed the superiority of rituximab maintenance therapy versus the then-gold standard azathioprine in maintaining antineutrophil cytoplasmic antibody (ANCA)–associated vasculitis (AAV) remission at 28 months following induction therapy with a cyclophosphamide/glucocorticoid regimen.
“Following publication of the primary results, some uncertainties remained as to the duration of remission on rituximab. There is a need for therapy that can prevent relapse over the longer term,” lead author Benjamin Terrier, MD, of the National Referral Center for Rare Systemic Autoimmune Diseases at Cochin Hospital and Paris Descartes University in France, said in his presentation of the long-term follow-up data at the annual meeting of the American College of Rheumatology.
The follow-up results indicate that “over the long term, despite late relapses after the 28-month initial follow-up period, maintenance therapy with rituximab remained significantly superior to azathioprine to maintain remission at 60 months and was associated with better survival,” Dr. Terrier said.
The study included 115 newly diagnosed or relapsing patients with AAV (granulomatosis with polyangiitis, microscopic polyangiitis, or eosinophilic granulomatosis with polyangiitis). Of these, 80% were newly diagnosed. After achieving remission on induction therapy, patients were randomized to 500-mg rituximab infusions on days 1 and 15, at 5.5 months, and then every 6 months through month 18, or to azathioprine for 22 months.
The investigators collected data prospectively on major and minor relapses and adverse events, using a Q-TWIST (Quality Adjusted Time Without Symptoms and Toxicity) analysis to show the trade-off between toxicity and disease activity.
For all relapses, major and minor, rituximab maintained superiority over azathioprine at 60 months. There were 24 events in the rituximab arm: 11 minor relapses and 13 major relapses. There were 36 events in the azathioprine arm: 10 minor relapses, 25 major relapses, and 1 death. Major relapse-free survival rates were 71.9% versus 49.4%, respectively (P = .003).
“There was an absolute difference of 12 months favoring rituximab for RFS,” Dr. Terrier said.
Serious infections were numerically increased in the rituximab-treated group: 30 compared with 20 in the azathioprine group. Cardiovascular event rates were similar for both groups.
Q-TWIST analysis found significantly longer quality-adjusted time without progression or toxicity in the rituximab arm (P < .001). The cumulative dose of steroids was comparable between treatment groups.
Six cancers were found in the azathioprine arm (including four skin cancers), compared with two in the rituximab arm.
In the rituximab group, PR3 ANCA positivity or ANCA persistence 12 months after starting rituximab maintenance therapy were associated with higher major relapse rates.
“Combining these two factors allows discerning low relapse rate. Patients negative for ANCA and for PR3 ANCA were at low risk,” Dr. Terrier told the audience. “ANCA monitoring seems to be relevant to guide treatment duration.”
Best rituximab regimen?
A separate study presented in a poster session compared the fixed-schedule regimen used in MAINRITSAN (the control group) with an experimental, individually tailored regimen: a 500-mg rituximab infusion on day 0 after randomization, with reinfusion until month 18 guided by ANCA status/titer and/or reappearance of circulating CD19 B cells, assessed every 3 months. The open-label, randomized study included 163 patients with granulomatosis with polyangiitis or microscopic polyangiitis in complete remission after induction therapy with glucocorticoids and cyclophosphamide, rituximab, or methotrexate.
At 28 months, the relapse rate was 8% in the control arm and 14% in the experimental arm, a difference that was not statistically significant.
“We found no difference in the primary endpoint of relapse between the two regimens, but the experimental arm received fewer infusions. From this study, we cannot make a strong recommendation for the experimental regimen, but we think it is better, because there is less cumulative exposure to rituximab,” stated lead author Pierre Charles, MD, also of Cochin Hospital.
Both studies were funded by Hoffmann-La Roche. Dr. Terrier and Dr. Charles disclosed financial support from Hoffmann-La Roche.
AT THE ACR ANNUAL MEETING
Key clinical point: Rituximab maintenance therapy remained superior to azathioprine for maintaining remission of ANCA-associated vasculitis at 60 months.
Major finding: At 60 months, overall survival rates were 100% for rituximab versus 93% for azathioprine (P = .045), and relapse-free survival rates were 57.9% versus 37.3%, respectively (P = .012).
Data source: 60-month follow-up of a randomized, controlled trial of 115 patients.
Disclosures: Both studies were funded by Hoffmann-La Roche. Dr. Terrier and Dr. Charles each disclosed financial support from Hoffmann-La Roche.
Allogeneic stem cells show promise for treating nonischemic dilated cardiomyopathy
NEW ORLEANS – Allogeneic stem cells appear to be a safe treatment option for nonischemic dilated cardiomyopathy and show somewhat greater efficacy than autologous stem cells, according to the results of the randomized POSEIDON-DCM trial.
“Nonischemic dilated cardiomyopathy is an incurable condition with significant genetic and immunologic underpinnings,” noted lead investigator Joshua M. Hare, MD, director of the Interdisciplinary Stem Cell Institute and professor of medicine at the University of Miami.
The phase I/II trial undertook a head-to-head comparison of allogeneic and autologous bone marrow–derived mesenchymal stem cells in 37 patients with nonischemic dilated cardiomyopathy.
Results presented at the American Heart Association scientific sessions and simultaneously published (J Am Coll Cardiol. 2016. doi: 10.1016/j.jacc.2016.11.009) showed that none of the patients in either group experienced a 30-day treatment-emergent serious adverse event, the trial’s primary endpoint.
The allogeneic group had a greater shift to a lesser inflammatory immune profile, and, at 12 months, a lower rate of major adverse cardiac events and more improvement in walk test distance. Additionally, half of patients in the allogeneic group no longer met the ejection fraction cutoff typically used to define dilated cardiomyopathy, compared with only about one-fifth of those in the autologous group.
“Immunomodulation may contribute to the efficacy of allogeneic human mesenchymal stem cells in nonischemic dilated cardiomyopathy, as we have shown suppression of immune activation to a greater degree with the allo versus auto cells,” Dr. Hare said.
“We argue that these data support the use of allogeneic mesenchymal stem cell therapy in future pivotal placebo-controlled clinical trials for this patient population, an important patient population with significant unmet need.”
Trial details
The patients enrolled in POSEIDON-DCM had left ventricular dysfunction due to nonischemic dilated cardiomyopathy and were randomized evenly to allogeneic or autologous stem cell therapy. Stem cells were delivered by transendocardial injection into 10 left ventricular sites using a catheter.
In the first 30 days after treatment, there were no treatment-emergent serious adverse events, defined as death, nonfatal myocardial infarction, stroke, hospitalization for worsening heart failure, cardiac perforation, pericardial tamponade, or sustained ventricular arrhythmias. “The 30-day safety and tolerability was excellent in both groups receiving either allogeneic or autologous therapy,” Dr. Hare said.
At 12 months, the allogeneic group had lower rates than the autologous group of major adverse cardiac events (20.3% vs. 57.1%, P = .0186) and all-cause rehospitalizations (28.2% vs. 70.0%, P = .0447).
In terms of efficacy, ejection fraction at 12 months had improved by a significant 8.0 points in the allogeneic group and a nonsignificant 5.4 points in the autologous group (P not significant for difference between groups). Roughly half of patients in the allogeneic group had achieved an ejection fraction of greater than 40%, compared with only two patients in the autologous group. “This is meaningful because the clinical definition of dilated cardiomyopathy typically uses an ejection fraction cutoff of 40%,” he noted.
The 6-minute walk test distance increased by a significant 37.0 m for the allogeneic group and by a nonsignificant 7.3 m for the autologous group (P = .0168 for difference between groups). Scores on the Minnesota Living With Heart Failure Questionnaire fell significantly in the former group and nonsignificantly in the latter group (P not significant for difference between groups).
Patients in the allogeneic group were more likely to have an improvement from baseline in New York Heart Association class (66.7% vs. 27.3%, P = .0527).
“An issue of concern in this field has been the formation of ectopic tissue with mesenchymal stem cells, so patients received whole-body CT scanning over 12 months,” Dr. Hare reported. “There was no ectopic tissue formation or tumor formation in any patient.”
In terms of biologic endpoints, two measures of endothelial function known to be suppressed in the setting of circulatory failure – endothelial progenitor cell colony-forming units and flow-mediated vasodilation – had increased significantly at 3 months in the allogeneic group only. Tumor necrosis factor–alpha levels fell by roughly 70% with allogeneic therapy versus 50% with autologous therapy (P = .05).
Both groups had a lessening of the immunosuppression that is common in heart failure, but benefit in several markers, such as the percentage of switched memory B cells, was greater with the allogeneic therapy. Additionally, there was a trend toward greater reduction of early T-cell activation in the allogeneic group.
“Of importance in the field of allogeneic cell therapy is [whether] the allogeneic cells mount a panel-reactive antigen [PRA],” commented Dr. Hare, who disclosed that he has an ownership interest in and is a consultant or advisory board member for Vestion.
Results showed that one patient in the allogeneic group developed a high-risk PRA, compared with none in the autologous group. Another four patients in the former group developed a moderate-risk PRA, compared with one in the latter group (P ≤ .05).
AT THE AHA SCIENTIFIC SESSIONS
Key clinical point: Allogeneic mesenchymal stem cells were safe in nonischemic dilated cardiomyopathy and showed somewhat greater efficacy than autologous cells.
Major finding: At 12 months, the allogeneic group had lower rates than the autologous group of major adverse cardiac events (20.3% vs. 57.1%, P = .0186) and all-cause rehospitalizations (28.2% vs. 70.0%, P = .0447).
Data source: A randomized phase I/II trial among 37 patients with nonischemic dilated cardiomyopathy (POSEIDON-DCM trial).
Disclosures: Dr. Hare disclosed that he has an ownership interest in and is a consultant or advisory board member for Vestion.
Targeted therapies predicted to blow out costs for CLL
The lifetime cost of treating chronic lymphocytic leukemia is forecast to rise precipitously for patients diagnosed today, as oral targeted therapies take over as the first-line treatment option, according to a study published November 21 in the Journal of Clinical Oncology.
The conclusion is based on economic models that also indicated the annual cost in the United States of managing chronic lymphocytic leukemia (CLL) will increase from its current level of $0.74 billion to $5.13 billion by 2025 (J Clin Oncol. 2016 Nov 21. doi: 10.1200/JCO.2016.68.2856).
While the majority of patients with CLL in the United States are covered by Medicare, they currently pay $9,200 in out-of-pocket costs for oral agents. That figure is forecast to rise to $57,000 for patients who start treatment in 2016, owing to the higher cost of oral targeted therapies.
“Such an economic impact could result in financial toxicity, limited access, and lower adherence to the oral therapies, which may undermine their clinical effectiveness,” Qiushi Chen, from the Georgia Institute of Technology, Atlanta, and his coauthors wrote. They called for a more sustainable pricing strategy for oral targeted therapies, rather than have clinicians be forced to choose less effective but more affordable management strategies.
The researchers developed a microsimulation model of CLL, simulating the dynamics of the patient population under given management strategies from 2011-2025.
Around 130,000 patients live with CLL in the United States, and around 15,000 new cases are diagnosed each year. By 2025, the authors forecast that 199,000 people will be living with the disease, a 55% increase driven both by new diagnoses and by improved survival with new oral targeted therapies.
Chemoimmunotherapy regimens – such as fludarabine, cyclophosphamide, and rituximab – have long been the standard first-line approach to CLL. But in recent years, new oral targeted agents such as ibrutinib and idelalisib have significantly improved progression-free survival and overall survival in CLL.
Ibrutinib is approved for first-line management of CLL, idelalisib is approved in combination with rituximab for patients with relapsed/refractory chronic lymphocytic leukemia, and venetoclax is approved for patients with relapsed chronic lymphocytic leukemia with del(17p).
“Both ibrutinib and idelalisib are priced at approximately $130,000 per [CLL patient per] year and are recommended until patients have progressive disease or significant toxicities,” the authors wrote. “In contrast, the costs for chemoimmunotherapy-based treatments range from $60,000 to $100,000 for a finite duration, that is, a typical six-cycle course that lasts for approximately 6 months.”
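The cost contrast the authors describe comes from treatment duration: oral agents are priced per year and continued until progression or toxicity, while chemoimmunotherapy is a one-time finite course. A minimal sketch of that arithmetic (the five-year horizon is a hypothetical example, not a figure from the study):

```python
ORAL_ANNUAL = 130_000           # approximate yearly price of ibrutinib/idelalisib
CIT_COURSE = (60_000, 100_000)  # one finite six-cycle course, ~6 months

def oral_cost(years_on_therapy):
    """Oral targeted agents are dosed until progression or
    significant toxicity, so total cost scales with time on therapy."""
    return years_on_therapy * ORAL_ANNUAL

# A hypothetical patient remaining on an oral agent for five years
# incurs 5 * $130,000 = $650,000, versus a one-time
# $60,000-$100,000 chemoimmunotherapy course.
five_year_total = oral_cost(5)
```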
The higher costs will add up to an additional $29 billion in spending through 2025, compared with around $1.12 billion annually for chemoimmunotherapy alone.
“Compared with the CIT scenario, the oral targeted therapy scenario resulted in an increase of 107,000 person–quality-adjusted life-years (149,000 person–life years), with additional discounted costs of $20.2 billion,” the authors reported.
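Dividing the two quoted figures gives a back-of-envelope incremental cost-effectiveness ratio for the oral-therapy scenario; this is an illustrative calculation from the numbers in the quote, and the paper's own analysis may apply discounting details that differ.

```python
additional_cost = 20.2e9   # discounted additional cost, oral vs CIT scenario
qalys_gained = 107_000     # person-quality-adjusted life-years gained

# Incremental cost-effectiveness ratio: extra dollars per QALY gained
icer = additional_cost / qalys_gained
print(f"${icer:,.0f} per QALY")  # roughly $189,000 per QALY
```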
The annual cost of cancer care in the United States is increasing across the board, from $143 billion in 2010 to $180 billion in 2020, but the cost of care for CLL is rising far more steeply than for other cancers. For example, breast and prostate cancers are forecast to see a 24%-38% increase in annual cost by 2020, while the cost for CLL is predicted to increase by 500%.
Seven authors declared research funding, honoraria, and consultancy funding from a range of pharmaceutical companies, including those involved in the manufacture of therapies for chronic lymphocytic leukemia. Two authors had no conflicts to declare.
FROM THE JOURNAL OF CLINICAL ONCOLOGY
Key clinical point: The high cost of oral therapies for chronic lymphocytic leukemia could result in financial toxicity, limited access, and lower adherence, which may undermine their clinical effectiveness.
Major finding: The cost of treating chronic lymphocytic leukemia is forecast to increase from its current level of $0.74 billion to $5.13 billion by 2025.
Data source: Microsimulation model of chronic lymphocytic leukemia treatment from 2010-2025.
Disclosures: Seven authors declared research funding, honoraria, and consultancy funding from a range of pharmaceutical companies including those involved in the manufacture of therapies for chronic lymphocytic leukemia. Two authors had no conflicts to declare.