New approaches needed for food allergies in minority children
ORLANDO – Food allergy takes a heavier toll on African-American and Hispanic children, compared with their white counterparts, an expert said.
These ethnic groups have higher odds of food sensitization than whites, and an analysis of the U.S. National Mortality Database found that food-related anaphylaxis turned fatal more often among African-Americans than among whites, Mahboobeh Mahdavinia, MD, PhD, an allergist and immunologist at Rush University Medical Center, Chicago, said at the joint congress of the American Academy of Allergy, Asthma & Immunology and the World Allergy Organization.
The “sadder news,” she said, is that the rate of fatal food-related anaphylaxis has worsened over time. Among African American males, the rate of fatal food-related anaphylaxis per million rose significantly, from 0.06 in 1999-2001 to 0.21 in 2008-2010 (P < .001), and fatal anaphylaxis caused by food was significantly associated with African American race (P < .001) (J Allergy Clin Immunol. 2014 Dec;134[6]:1318-28.e7).
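To put that change in perspective, the quoted rates correspond to a 3.5-fold increase over roughly a decade; a minimal arithmetic sketch of that calculation, our own illustration using only the per-million rates above, follows.

```python
# Fold increase in fatal food-related anaphylaxis among African American males,
# using the per-million rates quoted above (illustrative arithmetic only).
rate_1999_2001 = 0.06  # deaths per million, 1999-2001
rate_2008_2010 = 0.21  # deaths per million, 2008-2010

fold_increase = rate_2008_2010 / rate_1999_2001
print(f"Fold increase: {fold_increase:.1f}x")  # prints 3.5x
```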
“There has been a lot of research and increasing awareness about food allergy, but this has certainly not affected minorities, and they’re even dying more from these diseases,” Dr. Mahdavinia said.
Studies also have shown that African-American and Hispanic children have a higher rate of emergency department visits for food allergy, compared with white children. Dr. Mahdavinia said this might be because their allergies are more severe, because they have less access to primary care, because allergy management practices at home are poorer, and because higher asthma rates in these children likely lead to worse food allergy incidents.
Compared with white children, African American children were significantly more likely to have allergy to wheat, soy, corn, fish, and shellfish (P < .01). Compared with white children, Hispanic children were significantly more likely to have allergy to corn, fish, and shellfish (P < .01) (J Allergy Clin Immunol Pract. 2017 Mar-Apr;5[2]:352-7.e1).
Children from low-income backgrounds, she noted, spend less on specialty outpatient care.
The difference in food allergy rates is likely linked, in part, to familial and cultural differences in childrearing, she said. African-American and Hispanic parents tend to introduce solid foods earlier and to breastfeed at lower rates than white families do.
Dr. Mahdavinia noted that while more affluent families are able to sidestep allergies by making a simple stop at a high-end grocer to get an allergen-free version of a food, poorer families are less able to buy these more expensive alternatives.
“The higher rate of asthma anaphylaxis observed in these minority children is concerning, especially when it’s considered in the context of the reported higher rate of fatal anaphylaxis associated with food allergy in African Americans,” she said. “So there’s a tremendous need for future studies.”
EXPERT ANALYSIS FROM AAAAI/WAO JOINT CONGRESS
Transporting stroke patients directly to thrombectomy boosts outcomes
LOS ANGELES – Evidence continues to mount that, in the new era of thrombectomy treatment for selected acute ischemic stroke patients, outcomes are better when patients go directly to the closest comprehensive stroke center that offers intravascular procedures rather than first being taken to a closer hospital and then needing transfer.
Nils H. Mueller-Kronast, MD, presented a modeled analysis of registry data on 236 real-world U.S. patients who underwent mechanical thrombectomy for an acute, large-vessel occlusion stroke following transfer from a hospital that could perform thrombolysis but could not offer thrombectomy. The analysis showed that, if the patients had instead gone directly to the closest thrombectomy center, the result would have been a 16-percentage-point increase in patients with a modified Rankin Scale (mRS) score of 0-1 after 90 days and a 9-percentage-point increase in mRS 0-2 outcomes, Dr. Mueller-Kronast said at the International Stroke Conference, sponsored by the American Heart Association.
The analysis he presented used data from the Systematic Evaluation of Patients Treated With Stroke Devices for Acute Ischemic Stroke (STRATIS) registry, which included 984 acute ischemic stroke patients who underwent mechanical thrombectomy at any one of 55 participating U.S. sites (Stroke. 2017 Oct;48[10]:2760-8). A previously reported analysis of the STRATIS data showed that the 55% of patients taken directly to a center that performed thrombectomy had a 60% rate of mRS score 0-2 after 90 days, compared with 52% of patients taken first to a hospital unable to perform thrombectomy and then transferred (Circulation. 2017 Dec 12;136[24]:2311-21).
The current analysis focused on 236 of the transferred patients with complete information on their location at the time of their stroke and subsequent time intervals during their transport and treatment, including 117 patients with ground transfer from their first hospital to the thrombectomy site, 114 with air transfer, and 5 with an unreported means of transport.
Dr. Mueller-Kronast and his associates calculated the time it would have taken each of the 117 ground-transported patients to have gone directly to the closest thrombectomy center (adjusted for traffic conditions at the time of the stroke) and modeled the likely outcomes of these patients based on the data collected in the registry. This projected a 47% rate of mRS scores of 0-1 (good outcomes) after 90 days and a 60% rate of mRS scores of 0-2 with a direct-to-thrombectomy strategy, compared with actual rates of 31% and 51%, respectively, among the patients who were transferred from their initial hospital.
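The percentage-point gains cited earlier follow directly from these modeled and observed rates; a minimal arithmetic sketch, using only the figures reported in this article, is shown below.

```python
# Modeled (direct-to-center) vs. actual (transferred) 90-day outcome rates, in percent,
# from the STRATIS ground-transfer analysis described above.
modeled = {"mRS 0-1": 47, "mRS 0-2": 60}
actual = {"mRS 0-1": 31, "mRS 0-2": 51}

for outcome in modeled:
    diff = modeled[outcome] - actual[outcome]
    print(f"{outcome}: {diff} percentage-point increase with direct transport")
# Output: mRS 0-1: 16 percentage-point increase with direct transport
#         mRS 0-2: 9 percentage-point increase with direct transport
```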
“Bypass to an endovascular-capable center may be an option to improve rapid access to mechanical thrombectomy,” he concluded.
The STRATIS registry is sponsored by Medtronic. Dr. Mueller-Kronast has been a consultant to Medtronic.
SOURCE: Mueller-Kronast N et al. Abstract LB12.
REPORTING FROM ISC 2018
Key clinical point: A direct-to-thrombectomy strategy maximizes good stroke outcomes.
Major finding: Modeling showed a 47% rate of good 90-day outcomes by taking patients to the closest thrombectomy center, compared with an actual 31% rate with transfers.
Study details: A simulation-model analysis of data collected by the STRATIS registry of acute stroke thrombectomies.
Disclosures: The STRATIS registry is sponsored by Medtronic. Dr. Mueller-Kronast has been a consultant to Medtronic.
Source: Mueller-Kronast N et al. Abstract LB12.
Nonmyeloablative conditioning gets a radiation boost for severe hemoglobinopathies
SALT LAKE CITY – A nonmyeloablative conditioning regimen with a boosted dose of total body irradiation yielded success for a cohort of patients with severe hemoglobinopathy and haploidentical donors.
Of 17 patients with severe sickle cell disease or beta-thalassemia who received allogeneic bone marrow transplants, all but one had successful engraftment, and 13 have achieved full donor chimerism, said Javier Bolaños-Meade, MD.
“Cure of severe hemoglobinopathies is now possible for most patients,” said Dr. Bolaños-Meade. “It should no longer be considered as available to only a fraction of such patients,” such as those who come with a fully matched donor and those able to tolerate myeloablative conditioning, he said.
Of the patients who received bone marrow transplants, five patients have stopped immunosuppressive therapy, and all patients are alive, having been followed for a median of 15 months (range, 3-34 months).
The rate of graft-versus-host disease (GVHD) was low: Two patients developed grade 2 acute GVHD, and one patient developed grade 3 acute GVHD; another three patients had mild to moderate chronic GVHD, but all GVHD has resolved, said Dr. Bolaños-Meade.
Historically, the difficulties with transplant in this population were numerous. “No. 1, it’s very difficult to find an HLA-matched donor,” said Dr. Bolaños-Meade. Also, since there’s no target for graft-versus-tumor effect post-transplant, any amount of chronic GVHD is also high on the list of concerns when considering a transplant for hemoglobinopathy.
“The other problem in this group of patients is their ability to tolerate myeloablation,” he said. The accumulated burden of disease, as well as sequelae of transfusion dependence for some, may make a myeloablative regimen too risky.
Dr. Bolaños-Meade said that he and his collaborators at Johns Hopkins University, Baltimore, wanted to be able to address all of these concerns in one regimen. “We were trying to work out a system that may be able to solve all the problems – to use nonmyeloablation and to use whatever donor is available.”
His research group had previously shown that nonmyeloablative transplants were well tolerated in patients with sickle cell disease and that haploidentical donors could be used (Blood. 2012 Nov 22;120[22]:4285-91). “However, we had a very high incidence of graft failure,” Dr. Bolaños-Meade said.
A strategy to increase the engraftment rate while still limiting toxicity, he said, would be to increase the dose of total body irradiation used in the conditioning regimen, from 200 to 400 centigray (cGy); this higher dose was incorporated into the study protocol.
Patients were enrolled if they had severe sickle cell disease (SCD; n = 12) or beta-thalassemia (n = 5).
To enroll in the study, SCD patients had to have been hospitalized at least twice a year in the preceding 2 years. The patients with SCD were a median 26 years of age (range, 6-31 years); four were male, and eight were female. Three of the SCD patients were transfusion dependent, and several had such serious complications as osteonecrosis, brain changes seen in medical imaging, and acute coronary syndrome.
The beta-thalassemia patients were a median 7 years of age (range, 6-16 years); all but one were female, and all had been transfusion dependent since infancy.
Bone marrow donors were not all first-degree relatives: There were five mothers, four fathers, four brothers, and three sisters, but also an aunt. Two pairs had major ABO incompatibility, and five had minor ABO incompatibility. Ten were ABO compatible.
The conditioning regimen for all patients involved rabbit antithymocyte globulin, fludarabine, and cyclophosphamide, and then total body irradiation given the day before transplant.
After transplant, in addition to standard supportive care, patients received cyclophosphamide on days 3 and 4. Beginning on day 5, patients received mycophenolate mofetil through day 35 and sirolimus for 1 full year after transplant.
The antithymocyte globulin induced sickle cell crises in all SCD patients, and one patient developed sirolimus-induced diabetes. One other patient had a worsening of Meniere disease, and another patient developed BK virus cystitis.
Breaking down outcomes by disease type, Dr. Bolaños-Meade said that the one engraftment failure occurred in an SCD patient. Of the remaining 11 engrafted patients, 9 have full donor chimerism, and all but 1 of the 11 are transfusion independent now. The patient who remains transfusion dependent has mixed chimerism and received bone marrow from a donor with major ABO mismatch. Although one of the five beta-thalassemia patients also has mixed chimerism, all are now transfusion independent.
The boost in hemoglobin post-transplant was relatively modest for the beta-thalassemia group, from a median 9.5 to 10.1 g/dL at the most recent visit. However, the pretransplant levels were boosted by transfusions for all patients in this group, Dr. Bolaños-Meade pointed out.
The SCD patients saw their hemoglobin go from a median 8.65 to 11.4 g/dL (P = .001). Median bilirubin for this group dropped from 2.4 to 0.2 mg/dL (P = .002) with the cessation of sickling-related hemolysis; significant improvements were also seen in absolute reticulocyte counts and lactate dehydrogenase levels.
Dr. Bolaños-Meade reported that he is on the data safety monitoring board of Incyte.
SOURCE: Bolaños-Meade J et al. BMT Tandem Meetings, Abstract LBA-3.
REPORTING FROM THE 2018 BMT TANDEM MEETINGS
Key clinical point: Nonmyeloablative conditioning with a boosted dose of total body irradiation enabled successful haploidentical bone marrow transplantation in most patients with severe hemoglobinopathies.
Major finding: Of the 17 patients who received haploidentical bone marrow transplant, 13 have achieved full chimerism.
Study details: Report of 17 consecutive patients with severe sickle cell disease or beta-thalassemia who received nonmyeloablative conditioning and bone marrow transplant from haploidentical donors.
Disclosures: Dr. Bolaños-Meade reported no outside sources of funding for the study. He is on the data safety monitoring board of Incyte.
Source: Bolaños-Meade J et al. 2018 BMT Tandem Meetings, Abstract LBA-3.
Consider steroid-induced hypertension when treating pediatric ALL
Nearly 15% of children undergoing induction therapy for acute lymphoblastic leukemia (ALL) developed and were treated for steroid-induced hypertension, according to a study conducted by researchers at the Ohio State University in Columbus.
Ian Bakk and his associates performed a retrospective review of data from the Pediatric Health Information System, a database of the Child Health Corporation of America consisting of inpatient information from 40 free-standing children’s hospitals in the United States. They looked at new cases of ALL from the period of 2009-2013 and analyzed data from 5,578 children who received induction chemotherapy for ALL. In all, nearly 15% of these children developed steroid-induced hypertension that required treatment during induction.
An adjusted regression analysis showed that infants less than 1 year of age had the highest odds of developing steroid-induced hypertension (adjusted odds ratio, 4.05), followed by children with abnormal glucose (aOR, 2.09), those with secondary diabetes mellitus (aOR, 1.67), and obese patients (aOR, 1.63).
“These findings can help physicians identify patients at high risk for [hypertension] at the time of ALL diagnosis,” the researchers wrote.
SOURCE: Bakk I et al. J Pediatr Hematol Oncol. 2018;40:27-30.
Intramuscular steroid injection reduced hip OA pain up to 12 weeks
Systemic treatment with an intramuscular glucocorticoid injection is effective, compared with placebo, in reducing pain in people with hip osteoarthritis for up to 12 weeks, a double-blinded, placebo-controlled, randomized trial suggests.
However, the study found benefit with intramuscular (IM) glucocorticoid injection at 2 weeks only when patients were at rest, and it did not find any significant benefit with the injection in reducing pain while walking or in reducing Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain subscale scores. The report was published in Annals of the Rheumatic Diseases.
The multicenter, double-blinded, superiority trial randomized 106 patients with painful hip OA who were not responding to oral analgesics to either 40 mg triamcinolone acetate (n = 52) or placebo injection (n = 54) into the gluteus muscle. Overall, 73 patients (68%) were women, and the average age of the cohort was 64 years. Hip OA symptoms had occurred for at least 1 year in 70% of the patients.
The study’s three primary outcomes (hip pain severity 2 weeks after the injection, measured on a 0-10 scale at rest, on a 0-10 scale during walking, and on the WOMAC pain subscale) revealed inconsistent results with the treatment.
At the 2-week follow-up, patients who had received the IM glucocorticoid injection had a significant and clinically relevant difference in hip pain at rest (between-group difference = –1.3; 95% confidence interval, –2.3 to –0.3; P = .01). But at this time point, there were no significant associations between glucocorticoid injection and hip pain during walking (difference = –0.9; 95% CI, –1.9 to 0.1; P = .07) or the WOMAC pain subscale score (difference = –6.1; 95% CI, –13.4 to 1.2; P = .10), the researchers reported.
At 2-week follow-up, recipients of the glucocorticoid injection were significantly more likely to perceive improvement (relative risk = 1.7; 95% CI, 1.1 to 2.7; P = .02) or achieve OMERACT-OARSI level of response (RR = 2.0; 95% CI, 1.1 to 3.6; P = .03).
The authors described this finding as “surprising,” speculating that the 7-point Likert scale used to measure perceived improvement could have resulted in less power.
Nineteen patients in the glucocorticoid group reported 27 nonserious adverse events, compared with 13 patients in the placebo group who reported 18 adverse events.
The authors said the greatest effects of the glucocorticoid injection were seen at 4- to 12-week follow-up (the secondary outcomes of the study), instead of at the 2-week follow-up. For example, at 4-week follow-up, the glucocorticoid injection was associated with a significant hip pain reduction at rest (between-group difference = –1.2; 95% CI, –2.1 to –0.2; P = .01) and during walking (difference = –1.1; 95% CI, –2.0 to –0.2; P = .01). At 6 weeks, the corresponding figures for hip pain reduction were –1.4 at rest (95% CI, –2.4 to –0.5; P = .005) and –1.4 while walking (95% CI, –2.3 to –0.4; P = .004). The between-group differences were still significant at 12 weeks while at rest (difference = –1.2; 95% CI, –2.3 to –0.2; P = .02) and during walking (difference = –1.3; 95% CI, –2.2 to –0.3; P = .01).
Significant differences in favor of the glucocorticoid injection overall occurred on the WOMAC subscale scores for pain, function, and stiffness, as well as total Hip disability and Osteoarthritis Outcome Score for pain and total, intermittent, and constant pain measures on the Intermittent and Constant Osteoarthritis Pain scale. At 12 weeks, the between-group difference on the WOMAC total score was –9.4 (95% CI, –17.8 to –0.9; P = .03).
The researchers said it was surprising that hip pain reduction after IM glucocorticoid injection was still present at a similar degree at 12 weeks since previous studies had shown the effect usually peaked after 1-3 weeks.
“Our findings should be replicated in future research,” they said.
“An IM glucocorticoid injection showed effectiveness in patients with hip OA on one of the three primary outcomes at 2 weeks post injection ... The effect is probably clinically relevant,” the authors concluded.
The investigators noted that in clinical practice patients are sometimes offered multiple injections per year, whereas in the current study patients received only one injection. There has also been concern that intra-articular glucocorticoid injections could cause toxicity to chondrocytes and potentially lead to OA progression, but the effect of a single IM injection is unknown.
Financial support for the study came from the Dutch Arthritis Foundation and the NutsOhra fund. Two of the authors reported receiving grants from several pharmaceutical companies, research consortia, and foundations.
SOURCE: Dorleijn D et al. Ann Rheum Dis. 2018 March 7. doi: 10.1136/annrheumdis-2017-212628
FROM ANNALS OF THE RHEUMATIC DISEASES
Key clinical point: A single intramuscular glucocorticoid injection reduced hip OA pain at rest, with effects lasting up to 12 weeks.
Major finding: At 2 weeks, patients who had received the intramuscular glucocorticoid injection had a significant and clinically relevant difference in hip pain at rest (between-group difference = –1.3; 95% CI, –2.3 to –0.3; P = .01).
Study details: A 12-week, double blinded, placebo-controlled trial of 106 patients with hip OA randomized to 40 mg triamcinolone acetate or placebo injection.
Disclosures: Financial support for the study came from the Dutch Arthritis Foundation and the NutsOhra fund. Two of the authors reported receiving grants from several pharmaceutical companies, research consortia, and foundations.
Source: Dorleijn D et al. Ann Rheum Dis. 2018 Mar 7. doi: 10.1136/annrheumdis-2017-212628.
Swamp coolers not linked to dust mite sensitization in atopic children
ORLANDO – Swamp coolers – a low-cost alternative to air-conditioning in dry regions – weren’t found to increase sensitization to house dust mites or mold in atopic pediatric patients, researchers reported.
Neema Izadi, MD, and his associates say the findings, seen in a pediatric Colorado population in a study evaluating data over 10 years, could mean that not everyone at risk of dust mite and mold sensitization needs to avoid these cooling systems.
“Evaporative coolers have been shown to raise relative humidity by about 10%,” said Dr. Izadi, a pediatric allergy and immunology fellow at National Jewish Health, Denver, presenting at the joint congress of the American Academy of Allergy, Asthma & Immunology and the World Allergy Organization. “They work best in environments where the air is very warm and dry.”
House dust mites and mold thrive in higher humidity. Small studies performed in Colorado, Utah, and other locations have shown that the swamp coolers increase house dust mite allergen content, but there have been very few studies that have looked at actual sensitization. One smaller study in Nevada did find that the coolers increased sensitization to dust mites and mold.
In this study – thought to be the largest ever to look at this question – Dr. Izadi and his colleagues assessed data on patients aged 21 years and younger who were seen at National Jewish Health during 2008-2017 and who had at least one positive environmental skin-prick test. The average age was about 9 years. The cohort included 8,503 patients evaluated for sensitization to house dust mites and 9,286 evaluated for sensitization to mold. The researchers also examined data on the presence of swamp coolers in the patients’ homes.
The researchers found that 29% of those with swamp coolers were dust-mite positive on skin testing, and 28% of those without one were positive. This was not a significant difference (P = .85). They found that 45% of those with the coolers were positive for sensitization to any mold, compared with 44% without one – also not a significant difference (P = .43).
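For readers who want to see how such a comparison of proportions can be checked, here is a minimal sketch; the group sizes are hypothetical placeholders, because the article reports only the percentages (29% vs. 28% dust-mite positive), not how many children actually had swamp coolers at home.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts for illustration only: the article gives percentages
# (29% vs. 28% dust-mite positive) but not the number of homes with swamp coolers.
with_cooler = [290, 710]      # [positive, negative], assuming n = 1,000 with a swamp cooler
without_cooler = [280, 720]   # [positive, negative], assuming n = 1,000 without one

chi2, p_value, dof, expected = chi2_contingency([with_cooler, without_cooler])
print(f"chi-square = {chi2:.2f}, P = {p_value:.2f}")
# A difference of about 1 percentage point, as reported, is far from statistical significance.
```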
They also found no difference by age group or sex, or among patients with atopic dermatitis, asthma, or allergic rhinitis considered individually.
He acknowledged that the study had no way to reliably account for patients who were transplants to Colorado, having moved there from somewhere else. The study also didn’t examine the age of the homes, whether they had carpeting, or other factors.
He noted that the amount of time the coolers were run in the home was not examined and that “it might matter how much it is on.” This, he said, might account for differences in these results, compared with the Nevada study that did find a sensitization increase caused by the coolers.
“Evaporative coolers or swamp coolers are a great low-cost alternative in semiarid and arid environments – they can cut costs from 15%-35%,” Dr. Izadi said. “These data may indicate that it may be unnecessary to recommend that patients remove their swamp cooler, at least from a dust-mite and mold-sensitization standpoint.”
Dr. Izadi had no relevant financial disclosures.
SOURCE: Izadi N et al. AAAAI/WAO Joint Congress, Abstract 586
Susan Millard, MD, FCCP, comments: Swamp coolers are used in semi-arid and arid climates like Arizona, where I did my fellowship training, but they didn't work well to keep apartments and homes cool enough when it was over about 100°F outside! The system is cheaper than air conditioning. So it is great to know that this type of cooling system does not cause more mold and dust mite allergies.
REPORTING FROM AAAAI/WAO JOINT CONGRESS
Key clinical point: Evaporative (swamp) coolers were not associated with increased sensitization to house dust mites or mold in atopic children.
Major finding: Researchers found that 29% of those with swamp coolers were dust-mite positive on skin testing, and 28% of those without one were as well. This was not a significant difference (P = .85).
Study details: A retrospective review of more than 17,000 cases of atopic children aged 21 years and younger who were seen at National Jewish Health and had a positive environmental skin prick test.
Disclosures: Dr. Izadi had no relevant financial disclosures.
Source: Izadi N et al. AAAAI/WAO Joint Congress, Abstract 586
MDedge Daily News: Have ‘The Talk’ about medical marijuana
Uniformity comes to inflammatory bowel disease severity, diabetes drives the nation’s drug bills, and there’s pushback against new blood sugar targets. Listen to the MDedge Daily News podcast for all the details on today’s top news.
Does Your Work Trigger Asthma?
As many as 1 in 5 asthma-related deaths in the US is due to occupational exposure—and many could be prevented, according to CDC researchers.
The researchers analyzed asthma-related mortality reports from 1999-2016, along with the occupations of the people involved; 3,396 asthma deaths occurred in 2015 alone. They estimated that between 11% and 21% of asthma deaths were due to occupational exposures. Health care workers and construction workers were at highest risk.
By industry, the highest number of deaths were among men working in construction (13%) and women working in health care (14%). By occupation, the most deaths were among men in construction trades (11%) and women in office and administrative support jobs (9%).
The researchers note that ongoing exposure to cleaners, disinfectants, and antibiotics can all trigger asthma. But they also point out that steps can be taken to limit the types of exposure that exacerbate asthma symptoms, such as replacing powdered latex gloves with powder-free natural rubber latex or nonlatex gloves.
In an interview with MD Magazine, principal investigator Jacek Mazurek, MD, PhD, said there’s also an opportunity for health care providers to intervene more effectively. “Inadequate screening of workers for occupational exposures by health providers and lack of recognition of associations between workplace exposures and asthma symptoms remain the main reasons for underrecognition and underdiagnosis of work-related asthma.”
The Occupational Safety and Health Administration offers guidance for diagnosing work-related asthma at https://www.osha.gov/SLTC/occupationalasthma/.
Reversal agent exhibits efficacy in patients with major bleeding
ORLANDO—Interim trial results suggest andexanet alfa can reverse the activity of factor Xa inhibitors in patients with acute major bleeding.
Andexanet alfa reduced median anti-factor Xa inhibitor activity by 91% in patients taking apixaban, 88% in those taking rivaroxaban, and 75% in patients taking enoxaparin.
Eleven percent of patients experienced thrombotic events, and 12% of patients died.
Stuart J. Connolly, MD, of McMaster University in Hamilton, Ontario, Canada, presented these results at the American College of Cardiology’s 67th Annual Scientific Session & Expo (ACC.18, abstract 409-14).
The trial, known as ANNEXA-4, was sponsored by Portola Pharmaceuticals, Inc.
“These data are particularly compelling when you consider the high-risk profile of the ANNEXA-4 population, which includes a substantial number of elderly patients presenting with intracranial hemorrhage and anticoagulated for venous thromboembolism, and the lack of any FDA- or EMA-approved reversal agent for these patients,” Dr Connolly said.
“The interim efficacy and safety data continue to support the promising role of AndexXa [the brand name for andexanet alfa] as an antidote to reverse anticoagulation in factor Xa-associated bleeding.”
Andexanet alfa is a recombinant modified factor Xa molecule designed to bind to and disable factor Xa inhibitors, thereby allowing factor Xa produced by the body to play its normal role in the formation of blood clots.
In earlier trials of healthy volunteers, andexanet alfa reversed the anticoagulant effect of factor Xa inhibitors without any significant safety problems.
ANNEXA-4 is an ongoing trial of andexanet alfa in patients experiencing major bleeding while taking factor Xa inhibitors.
Dr Connolly presented safety outcomes for 227 trial subjects and adjudicated efficacy outcomes for 132 subjects. All subjects presented with acute major bleeding within 18 hours of taking a factor Xa inhibitor, including apixaban, rivaroxaban, enoxaparin, and edoxaban.
The patients received andexanet alfa as a bolus dose over 20 to 30 minutes, followed by a 2-hour infusion. The dosage was based on the specific factor Xa inhibitor each patient was taking and how long it had been since the last dose.
The patients were evaluated for 30 days after andexanet alfa administration.
Efficacy
The median age of the efficacy population (n=137) was 77, and 51% (n=70) were male. Indications for anticoagulation included atrial fibrillation (Afib, 76%, n=104), venous thromboembolism (VTE, 28%, n=38), and both Afib and VTE (4%, n=6).
Patients were receiving apixaban (n=68), rivaroxaban (n=54), or enoxaparin (n=10). None of these patients had received edoxaban.
Types of bleeding included intracranial hemorrhage (ICH, 57%, n=78), gastrointestinal (GI) bleeding (31%, n=43), and “other” bleeding (12%, n=16).
The researchers assessed andexanet alfa’s efficacy in terms of 2 co-primary endpoints—reduction in anti-factor Xa inhibitor activity and achievement of clinical hemostasis by 12 hours after administration. Hemostatic efficacy was assessed by an independent endpoint adjudication committee as “excellent,” “good,” or “poor/none.”
After the bolus dose of andexanet alfa, the median anti-factor Xa inhibitor activity was reduced by 91% for patients taking apixaban, 88% for those taking rivaroxaban, and 75% for those taking enoxaparin.
For patients taking rivaroxaban, the median anti-factor Xa inhibitor activity was reduced by 88% at the end of the bolus dose, 87% at the end of the infusion, 42% at 4 hours, 49% at 8 hours, and 60% at 12 hours.
For patients taking apixaban, the median anti-factor Xa inhibitor activity was reduced by 91% at the end of the bolus dose, 91% at the end of the infusion, 36% at 4 hours, 30% at 8 hours, and 35% at 12 hours.
Overall, 83% of patients had “excellent” or “good” clinical hemostasis, and 17% had “poor/none.”
Hemostasis was deemed excellent or good in 83% of patients on rivaroxaban, 82% of those on apixaban, and 80% of those on enoxaparin. It was excellent/good in 86% of patients with GI bleeding, 81% of patients with ICH, and 80% of patients with bleeding at other sites.
Safety
The median age of the safety population (n=227) was 77, and 52% (n=117) were male. Indications for anticoagulation included Afib (78%, n=178), VTE (23%, n=52), and both Afib and VTE (4%, n=8).
Patients were receiving apixaban (n=117), rivaroxaban (n=90), enoxaparin (n=17), and edoxaban (n=3). Types of bleeding included ICH (61%, n=139), GI (27%, n=62), and “other” (12%, n=26).
During the 30-day follow-up period, the rate of thrombotic events was 11% (n=24) for the entire population and 12% (n=17) among patients with ICH.
The mortality rate for all patients was 12% (n=27). Eleven deaths were due to cardiovascular causes.
According to Dr Connolly, these rates of adverse events are in line with what would be expected given the underlying medical condition of the patients in the trial and the fact that many (43%) had not resumed anticoagulant treatment in the 30-day follow-up period.
Two patients experienced an infusion reaction.
None of the patients developed antibodies to factor Xa or factor X, and there were no neutralizing antibodies to andexanet alfa.
Heme implicated in adverse transfusion outcomes
Free heme plays a key role in the adverse effects associated with massive transfusion of stored red blood cells (RBCs), according to researchers.
Experiments in a mouse model of trauma hemorrhage revealed a greater risk of mortality from bacterial pneumonia in mice that received transfusions of blood stored for 14 days, rather than fresh blood.
This greater risk was dependent upon free heme, which is released from RBCs during storage and upon transfusion.
In a study of human trauma patients, researchers found the amount of heme was proportional to the amount of blood transfused.
Brant M. Wagener, MD, PhD, of the University of Alabama at Birmingham, and his colleagues detailed these findings in PLOS Medicine.
In the mouse model of trauma hemorrhage, the researchers resuscitated mice using either fresh blood (stored for 0 days) or blood stored for 2 weeks. (A 2-week storage of mouse blood approximates storage of human RBCs for 42 days.)
Two days after transfusion, the mice were challenged by instilling the lungs with the bacteria Pseudomonas aeruginosa.
Mice that received the stored blood had a significant increase in bacterial lung injury, as shown by higher mortality and by increased fluid accumulation and bacterial numbers in the lungs.
The researchers identified the connection between free heme and infection susceptibility/severity in 2 ways.
First, Pseudomonas aeruginosa-induced mortality was completely prevented by the addition of hemopexin, a scavenging protein that removes free heme from the blood.
Second, adding an inhibitor of toll-like receptor 4 (TLR4), or genetically removing TLR4 from mice, also prevented bacteria-induced mortality. Free heme—which is known to induce inflammatory injury to major organs in diseases such as sickle cell disease and sepsis—acts, in part, by activating TLR4.
The researchers also found that transfusion with stored blood induced release of the inflammation mediator high mobility group box 1 (HMGB1). But an anti-HMGB1 antibody protected mice from bacteria-induced mortality.
The anti-HMGB1 antibody also restored macrophage-dependent phagocytosis of Pseudomonas aeruginosa in vitro.
Tissue culture experiments had revealed that free heme inhibits macrophages from ingesting Pseudomonas aeruginosa and that the addition of free heme increases permeability in endothelial cells.
Finally, in a 16-month study, the researchers found that human trauma-hemorrhage patients who received large amounts of transfused blood were also receiving amounts of free heme sufficient to overwhelm the normal amounts of hemopexin found in a person’s blood.
The researchers said this work underscores the need to confirm whether the storage age of transfused RBCs correlates with increasing levels of free heme after transfusion. The team would also like to establish whether patients with low ratios of hemopexin to free heme have a greater risk for adverse outcomes after massive transfusions.