Switching from TDF- to TAF-Containing Antiretroviral Therapy: Impact on Bone Mineral Density in Older Patients Living With HIV
Study Overview
Objective. To evaluate the effect of changing from tenofovir disoproxil fumarate (TDF)-containing antiretroviral therapy (ART) to tenofovir alafenamide (TAF)-containing ART in patients aged 60 years and older living with HIV.
Design. Prospective, open-label, multicenter, randomized controlled trial.
Setting and participants. The study was completed across 36 European centers over 48 weeks. Patients were enrolled from December 12, 2015, to March 21, 2018, and were eligible to participate if they were diagnosed with HIV-1; virologically suppressed to < 50 copies/mL; on a TDF-containing ART regimen; and ≥ 60 years of age.
Intervention. Participants (n = 167) were randomly assigned in a 2:1 ratio to ART with TAF (10 mg), elvitegravir (EVG; 150 mg), cobicistat (COB; 150 mg), and emtricitabine (FTC; 200 mg) or to continued therapy with a TDF-containing ART regimen (300 mg TDF).
Main outcome measures. Primary outcome measures were the change in spine and hip bone mineral density from baseline at week 48. Secondary outcome measures included bone mineral density changes from baseline at week 24, HIV viral suppression and change in CD4 count at weeks 24 and 48, and the assessment of safety and tolerability of each ART regimen until week 48.
Main results. At 48 weeks, patients (n = 111) in the TAF+EVG+COB+FTC group had a mean 2.24% (SD, 3.27) increase in spine bone mineral density, while those in the TDF-containing group (n = 56) had a mean 0.10% decrease (SD, 3.39), a difference of 2.43% (95% confidence interval [CI], 1.34-3.52; P < 0.0001). In addition, at 48 weeks patients in the TAF+EVG+COB+FTC group had a mean 1.33% increase (SD, 2.20) in hip bone mineral density, as compared with a mean 0.73% decrease (SD, 3.21) in the TDF-containing group, a difference of 2.04% (95% CI, 1.17-2.90; P < 0.0001).
Similar results were seen in spine and hip bone mineral density in the TAF+EVG+COB+FTC group at week 24, with increases of 1.75% (P = 0.00080) and 1.35% (P = 0.00040), respectively. Both treatment groups maintained high virologic suppression: 94.5% at week 24 and 93.6% at week 48 in the TAF+EVG+COB+FTC group, as compared with 100% and 94.5%, respectively, in the TDF-containing group. In addition, the TAF+EVG+COB+FTC group had an increase in CD4 count from baseline (56 cells/µL), with essentially no change in the TDF-containing group (–1 cell/µL). Patients in the TAF+EVG+COB+FTC group had a mean 27.8 mg/g decrease in urine albumin-to-creatinine ratio (UACR) versus a 7.7 mg/g decrease in the TDF-containing group (P = 0.0042), as well as a mean 49.8 mg/g decrease in urine protein-to-creatinine ratio (UPCR) versus a 3.8 mg/g decrease in the TDF-containing group (P = 0.0042).
Conclusion. Patients 60 years of age or older living with virologically suppressed HIV may benefit from improved bone mineral density by switching from a TDF-containing ART regimen to a TAF-containing regimen after 48 weeks, which, in turn, may help to reduce the risk for osteoporosis. Patients who were switched to a TAF-containing regimen also had favorable improvements in UACR and UPCR, which could indicate better renal function.
Commentary
The Centers for Disease Control and Prevention estimated that in 2018 nearly half of those living with HIV in the United States were older than 50 years.1 Today, the life expectancy of patients living with HIV on ART in developed countries is similar to that of patients not living with HIV. A meta-analysis published in 2017 estimated that patients diagnosed with HIV at age 20 beginning ART have a life expectancy of 63 years, and another study estimated that life expectancy in such patients is 89.1% of that of the general population in Canada.2,3 Overall, most people living with HIV infection are aging and at risk for medical conditions similar to persons without HIV disease. However, rates of osteoporosis in elderly patients with HIV are estimated to be 3 times greater than rates in persons without HIV.4 As a result, it is becoming increasingly important to find ways to decrease the risk of osteoporosis in these patients.
ART typically includes a nucleoside reverse transcriptase inhibitor (NRTI) combination and a third agent, such as an integrase strand inhibitor. Tenofovir is a commonly used backbone NRTI that comes in 2 forms, TDF (tenofovir disoproxil fumarate) and TAF (tenofovir alafenamide). Both are prodrugs that are converted to tenofovir diphosphate. TDF specifically is associated with an increased risk of bone loss and nephrotoxicity. The loss in bone mineral density is most similar to the bone loss seen with oral glucocorticoids.5 TDF has been shown to increase plasma levels of RANKL and tumor necrosis factor-α, leading to increased bone resorption.6 The long-term effects of TDF- versus TAF-containing ART on bone mineral density have, to our knowledge, not been compared previously in a randomized controlled trial. The significance of demonstrating an increase in bone mineral density in the prevention of osteoporotic bone fracture in people living with HIV is less clear. A long-term cohort study completed in Japan in patients on TDF showed an increased risk of bone fractures in both older postmenopausal women and younger men.7 However, a retrospective cohort study of 1981 patients with HIV found no association between bone fractures and TDF.8
This randomized controlled trial used appropriate methods to measure the reported primary and secondary endpoints; however, it would be of benefit to continue following these patients to measure their true long-term risk of osteoporosis-related complications. In terms of the study’s secondary endpoints, it is notable that the patients maintained HIV viral suppression after the switch and CD4 counts remained stable (with a slight increase observed in the TAF-containing ART cohort).
With regard to renal function, patients in the TAF group had significantly greater improvements in UACR and UPCR, which likely reflect improved glomerular filtration. Preserving renal function is increasingly important for patients with HIV, as up to 48.5% have some form of chronic kidney disease.9
Applications for Clinical Practice
This study shows that making the switch from TDF- to TAF-containing ART can lead to improved bone mineral density. We can extrapolate that switching may lead to a decreased risk of osteoporosis and osteoporosis-related complications, such as bone fracture, but this needs to be investigated in more detail. As demonstrated in this study, switching from a TDF- to a TAF-containing regimen can also lead to improved renal function while maintaining HIV viral suppression and CD4 counts.
Unfortunately, the regimen selected with TAF in this study (elvitegravir, cobicistat, and emtricitabine) includes cobicistat, which is no longer recommended as initial therapy due to its risk of drug-drug interactions, and elvitegravir, which has a lower barrier to resistance than other integrase strand inhibitors.10,11 The United States Department of Health and Human Services guidelines and the International Antiviral Society-USA Panel suggest using several other TAF-containing regimens for beginning or even switching therapy in older patients.10,11
When choosing between either a TAF- or a TDF-containing regimen to treat HIV infection in older patients, increasing evidence shows that using a TAF-containing ART regimen may be more beneficial for people living and aging with virologically suppressed HIV infection.
–Sean P. Bliven and Norman L. Beatty, MD, University of Florida College of Medicine, Division of Infectious Diseases and Global Medicine, Gainesville, FL
1. Centers for Disease Control and Prevention. HIV among people aged 50 and over. 2018. https://www.cdc.gov/hiv/group/age/olderamericans/index.html. Accessed November 22, 2019.
2. Teeraananchai S, Kerr S, Amin J, et al. Life expectancy of HIV-positive people after starting combination antiretroviral therapy: a meta-analysis. HIV Med. 2017;18:256-266.
3. Wandeler G, Johnson LF, Egger M. Trends in life expectancy of HIV-positive adults on antiretroviral therapy across the globe. Curr Opin HIV AIDS. 2016;11:492-500.
4. Brown TT, Qaqish RB. Antiretroviral therapy and the prevalence of osteopenia and osteoporosis: a meta-analytic review. AIDS. 2006;20:2165-2174.
5. Bolland MJ, Grey A, Reid IR. Skeletal health in adults with HIV infection. Lancet Diabetes Endocrinol. 2015;3:63-74.
6. Ofotokun I, Titanji K, Vunnava A, et al. Antiretroviral therapy induces a rapid increase in bone resorption that is positively associated with the magnitude of immune reconstitution in HIV infection. AIDS. 2016;30:405-414.
7. Komatsu A, Ikeda A, Kikuchi A, et al. Osteoporosis-related fractures in HIV-infected patients receiving long-term tenofovir disoproxil fumarate: an observational cohort study. Drug Saf. 2018;41:843-848.
8. Gediminas L, Wright EA, Dong Y, et al. Factors associated with fractures in HIV-infected persons: which factors matter? Osteoporos Int. 2017;28:239-244.
9. Naicker S, Rahmania, Kopp JB. HIV and chronic kidney disease. Clin Nephrol. 2015;83(Suppl 1):S32-S38.
10. United States Department of Health and Human Services. Guidelines for the use of antiretroviral agents in adults and adolescents living with HIV. https://aidsinfo.nih.gov/guidelines/html/1/adult-and-adolescent-arv/0. Accessed December 10, 2019.
11. Saag MS, Benson CA, Gandhi RT, et al. Antiretroviral drugs for treatment and prevention of HIV infection in adults: 2018 recommendations of the International Antiviral Society-USA Panel. JAMA. 2018;320:379-396.
Nonculprit Lesion PCI Strategies in Patients With STEMI Without Cardiogenic Shock
Study Overview
Objective. To determine whether percutaneous coronary intervention (PCI) of a nonculprit lesion in patients with ST-segment elevation myocardial infarction (STEMI) reduces the risk of cardiovascular death or myocardial infarction.
Design. International, multicenter, randomized controlled trial blinded to outcome.
Setting and participants. Patients with STEMI who had multivessel coronary disease and had undergone successful PCI to the culprit lesion.
Intervention. A total of 4041 patients were randomly assigned to either PCI of angiographically significant nonculprit lesions or optimal medical therapy without further revascularization. Randomization was stratified according to intended timing of nonculprit lesion PCI (either during or after the index hospitalization).
Main outcome measures. The first co-primary endpoint was the composite of cardiovascular death or myocardial infarction (MI). The second co-primary endpoint was the composite of cardiovascular death, MI or ischemia-driven revascularization.
Main results. At a median follow-up of 3 years, the composite of cardiovascular death or MI occurred in 158 of the 2016 patients (7.8%) in the nonculprit PCI group and in 213 of the 2025 patients (10.5%) in the culprit-lesion-only group (hazard ratio, 0.73; 95% confidence interval [CI], 0.60-0.91; P = 0.004). The second co-primary endpoint occurred in 179 patients (8.9%) in the nonculprit PCI group and in 339 patients (16.7%) in the culprit-lesion-only group (hazard ratio, 0.51; 95% CI, 0.43-0.61; P < 0.001).
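To put the reported event rates in clinical perspective, the absolute risk reduction and number needed to treat over the median 3-year follow-up can be sketched from the figures above. This is an illustrative back-of-the-envelope calculation, not a result reported by the trial investigators:

```python
# Event counts for the first co-primary endpoint (cardiovascular death or MI),
# taken from the trial summary above.
events_culprit_only = 213 / 2025   # ~10.5% in the culprit-lesion-only group
events_complete = 158 / 2016       # ~7.8% in the nonculprit (complete) PCI group

# Absolute risk reduction and number needed to treat (unadjusted, illustrative).
arr = events_culprit_only - events_complete
nnt = 1 / arr

print(f"ARR = {arr:.1%}")   # roughly 2.7 percentage points over ~3 years
print(f"NNT = {nnt:.0f}")   # roughly 37 patients treated to prevent one event
```

These crude figures ignore censoring and follow-up time (the trial's primary analysis used hazard ratios), but they convey the approximate magnitude of benefit.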
Conclusion. Among patients with STEMI and multivessel disease, those who underwent complete revascularization with nonculprit lesion PCI had lower rates of cardiovascular death or MI compared to patients with culprit-lesion-only revascularization.
Commentary
Patients presenting with STEMI often have multivessel disease.1 Although it is known that mortality can be reduced by early revascularization of the culprit vessel,2 whether the nonculprit vessel should be revascularized at the time of presentation with STEMI remains controversial.
Recently, multiple studies have reported the benefit of nonculprit vessel revascularization in patients presenting with hemodynamically stable STEMI. Four trials (PRAMI, CvLPRIT, DANAMI-3-PRIMULTI, and COMPARE-ACUTE) investigated this clinical question with different designs, and all reported a benefit of nonculprit vessel revascularization compared to a culprit-only strategy.3-6 However, the differences in the composite endpoints were driven mainly by the softer endpoints used in these trials, such as refractory angina and ischemia-driven revascularization, and none of these trials was adequately powered to evaluate differences in hard outcomes, such as death or MI.
In this context, Mehta et al conducted a well-designed randomized controlled trial to investigate whether complete revascularization with PCI of nonculprit vessels would reduce the composite of cardiovascular death or MI compared with a culprit-only strategy. At a median follow-up of 3 years, patients who underwent nonculprit vessel PCI had a lower incidence of cardiovascular death or MI than those managed with the culprit-only strategy (7.8% versus 10.5%). The second co-primary endpoint (the composite of cardiovascular death, MI, or ischemia-driven revascularization) also occurred significantly less frequently in the nonculprit PCI group than in the culprit-only group (8.9% versus 16.7%).
The current study has a number of strengths. First, it was a multicenter, international study that enrolled a large number of patients (> 4000), achieving adequate power to evaluate the composite of death and MI. Second, the treatments patients received reflect contemporary medical therapy and interventional practice: the potent P2Y12 inhibitor ticagrelor, high-dose statins, and ACE inhibitors were prescribed at high rates, and radial access (> 80%) and current-generation drug-eluting stents were likewise used at high rates. Third, all angiograms were reviewed by a core lab to evaluate completeness of revascularization. Fourth, the trial mandated use of fractional flow reserve to assess lesions with 50% to 69% stenosis before revascularization was considered, ensuring that only ischemic or very-high-grade lesions were revascularized. Fifth, the crossover rate in each group was low compared with previous studies (4.7% into the complete revascularization group, 3.9% into the culprit-lesion-only group). Finally, the study evaluated the timing of nonculprit PCI: randomization was stratified according to the intended timing of nonculprit PCI, either during the index hospitalization or after hospital discharge (within 45 days), and the benefit was consistent regardless of when nonculprit PCI was performed.
Although the COMPLETE study’s design has a number of strengths, it is important to note that patients enrolled in this trial represent a lower-risk STEMI population. Patients with complex anatomy likely were not included, as evidenced by a lower SYNTAX score (mean, 16). Furthermore, no patients who presented with STEMI complicated by cardiogenic shock were enrolled. In the recent CULPRIT SHOCK trial, which focused on patients who had multivessel disease, acute MI, and cardiogenic shock, patients who underwent the culprit-only strategy had a lower rate of death or renal replacement therapy, as compared to patients who underwent immediate complete revascularization.7 Therefore, whether the findings from the COMPLETE study can be extended to a sicker population requires further study.
In 2015, results from earlier trials such as PRAMI and CvLPRIT led to a focused update of the US PCI guidelines.8 Recommendations for noninfarct-related artery PCI in hemodynamically stable patients presenting with acute MI were upgraded from class III to class IIb. The results of the COMPLETE trial will likely influence future guidelines, with stronger recommendations toward complete revascularization in patients presenting with hemodynamically stable STEMI.
Applications for Clinical Practice
In patients presenting with hemodynamically stable STEMI, staged complete revascularization, including the nonculprit vessel, should be considered.
– Taishi Hirai, MD, University of Missouri, Columbia, MO, and John EA Blair, MD, University of Chicago Medical Center, Chicago, IL
1. Park DW, Clare RM, Schulte PJ, et al. Extent, location, and clinical significance of non-infarct-related coronary artery disease among patients with ST-elevation myocardial infarction. JAMA. 2014;312:2019-2027.
2. Hochman JS, Sleeper LA, Webb JG, et al. Early revascularization in acute myocardial infarction complicated by cardiogenic shock. SHOCK Investigators. Should We Emergently Revascularize Occluded Coronaries for Cardiogenic Shock. N Engl J Med. 1999;341:625-634.
3. Wald DS, Morris JK, Wald NJ, et al. Randomized trial of preventive angioplasty in myocardial infarction. N Engl J Med. 2013;369:1115-1123.
4. Gershlick AH, Khan JN, Kelly DJ, et al. Randomized trial of complete versus lesion-only revascularization in patients undergoing primary percutaneous coronary intervention for STEMI and multivessel disease: the CvLPRIT trial. J Am Coll Cardiol. 2015;65:963-972.
5. Engstrom T, Kelbaek H, Helqvist S, et al. Complete revascularisation versus treatment of the culprit lesion only in patients with ST-segment elevation myocardial infarction and multivessel disease (DANAMI-3-PRIMULTI): an open-label, randomised controlled trial. Lancet. 2015;386:665-671.
6. Smits PC, Abdel-Wahab M, Neumann FJ, et al. Fractional flow reserve-guided multivessel angioplasty in myocardial infarction. N Engl J Med. 2017;376:1234-1244.
7. Thiele H, Akin I, Sandri M, et al. PCI strategies in patients with acute myocardial infarction and cardiogenic shock. N Engl J Med. 2017;377:2419-2432.
8. Levine GN, Bates ER, Blankenship JC, et al. 2015 ACC/AHA/SCAI focused update on primary percutaneous coronary intervention for patients with ST-elevation myocardial infarction: an update of the 2011 ACCF/AHA/SCAI guideline for percutaneous coronary intervention and the 2013 ACCF/AHA guideline for the management of ST-elevation myocardial infarction. J Am Coll Cardiol. 2016;67:1235-1250.
Deferiprone noninferior to deferoxamine for iron overload in SCD, rare anemias
ORLANDO – The oral iron chelator deferiprone showed noninferiority to deferoxamine for treating iron overload in patients with sickle cell disease and other rare anemias in a randomized open-label trial.
The least squares mean change from baseline in liver iron concentration (LIC) – the primary study endpoint – was –4.04 mg/g dry weight (dw) in 152 patients randomized to receive deferiprone, and –4.45 mg/g dw in 76 who received deferoxamine, Janet L. Kwiatkowski, MD, of the Children’s Hospital of Philadelphia reported at the annual meeting of the American Society of Hematology.
The upper limit of the stringent 96.01% confidence interval used for the evaluation of noninferiority was 1.57; thus, the findings demonstrated noninferiority of deferiprone, Dr. Kwiatkowski said.
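The noninferiority logic can be sketched as a simple decision rule: compute the between-arm difference in mean LIC change, then declare noninferiority if the upper limit of its confidence interval falls below a prespecified margin. The margin value below is a hypothetical placeholder for illustration; the trial's actual prespecified margin is not stated in this report:

```python
# Values taken from the report; the noninferiority margin is a placeholder assumption.
mean_change_deferiprone = -4.04   # mg/g dw, least squares mean change in LIC
mean_change_deferoxamine = -4.45  # mg/g dw
ci_upper = 1.57                   # reported upper limit of the 96.01% CI

# Point estimate of the difference (deferiprone minus deferoxamine): 0.41 mg/g dw.
difference = mean_change_deferiprone - mean_change_deferoxamine

NONINFERIORITY_MARGIN = 2.0  # hypothetical margin, for illustration only

# Noninferior if the entire CI lies below the margin.
noninferior = ci_upper < NONINFERIORITY_MARGIN
print(f"difference = {difference:.2f} mg/g dw; noninferior: {noninferior}")
```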
Deferiprone also showed noninferiority for the secondary endpoints of change in cardiac iron (about –0.02 ms on T2* MRI, log-transformed for both groups) and serum ferritin levels (–415 vs. –750 mcg/L for deferiprone vs. deferoxamine) at 12 months. The difference between the groups was not statistically significant for either endpoint.
Study participants were aged 2 years and older (mean age, 16.9 years) and had an LIC between 7 and 30 mg/g dw. They were recruited from 33 sites in nine countries and randomized 2:1 to receive deferiprone or deferoxamine for up to 12 months. In patients with lower transfusional iron input and/or less severe iron load, deferiprone was dosed at 75 mg/kg daily and deferoxamine at 20 mg/kg for children and 40 mg/kg for adults; in those with higher iron input and/or more severe iron load, deferiprone was dosed at 99 mg/kg daily and deferoxamine at up to 40 mg/kg for children and up to 50 mg/kg for adults.
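The weight-based dosing tiers described above can be summarized in a small helper. The per-kg doses are taken from the text; the function itself is illustrative, and the upper-tier deferoxamine doses (reported as "up to" values) are treated here as maximums:

```python
def daily_dose_mg(drug, weight_kg, is_child, high_iron_burden):
    """Return the daily dose in mg for the dosing tiers described in the trial.

    Illustrative sketch only; upper-tier deferoxamine values are maximums.
    """
    if drug == "deferiprone":
        per_kg = 99 if high_iron_burden else 75
    elif drug == "deferoxamine":
        if high_iron_burden:
            per_kg = 40 if is_child else 50  # "up to" values
        else:
            per_kg = 20 if is_child else 40
    else:
        raise ValueError(f"unknown drug: {drug}")
    return per_kg * weight_kg

# Example: a 30-kg child with lower iron burden on deferiprone receives
# 75 mg/kg x 30 kg = 2250 mg daily.
print(daily_dose_mg("deferiprone", 30, is_child=True, high_iron_burden=False))
```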
“Over the course of the treatment period, the dosage could be adjusted downward if there were side effects, or upward if there was no improvement in iron burden,” Dr. Kwiatkowski said, adding that after 12 months, patients had the option of continuing on to a 2-year extension trial in which all participants received deferiprone.
No significant demographic differences were noted between the groups; 84% in both groups had sickle cell disease, and the remaining patients had other, rarer forms of transfusion-dependent anemia. Baseline iron burden was similar in the groups.
The rates of acceptable compliance over the course of the study were also similar at 69% and 79% in the deferiprone and deferoxamine arms, respectively, she noted.
No statistically significant difference between the groups was seen in the overall rates of adverse events (AEs), treatment-related AEs, serious AEs, or withdrawals from the study due to AEs. Agranulocytosis occurred in one deferiprone patient and no deferoxamine patients, and mild or moderate neutropenia occurred in four patients and one patient, respectively.
All episodes resolved, no difference was seen in the rates of any of the serious AEs, and no unexpected serious adverse events occurred, she said.
Patients with sickle cell disease or other rare anemias whose care includes chronic blood transfusions require iron chelation to prevent iron overload. Currently, only deferoxamine and deferasirox are approved chelators in these patient populations, she said, noting that in 2011 deferiprone received accelerated Food and Drug Administration approval for the treatment of thalassemia.
The current study was conducted because of an FDA requirement for postmarket assessment of deferiprone’s efficacy and safety in patients with sickle cell disease and other anemias who develop transfusional iron overload. Because it was initiated before the approval of deferasirox for first-line treatment of SCD, deferiprone was compared only with deferoxamine, she explained.
Dr. Kwiatkowski reported research funding from Apopharma, bluebird bio, Novartis, and Terumo, and consultancy for Agios, bluebird bio, Celgene, and Imara.
ORLANDO – The oral iron chelator deferiprone showed noninferiority to deferoxamine for treating iron overload in patients with sickle cell disease and other rare anemias in a randomized open-label trial.
The least squares mean change from baseline in liver iron concentration (LIC) – the primary study endpoint – was –4.04 mg/g dry weight (dw) in 152 patients randomized to receive deferiprone, and –4.45 mg/g dw in 76 who received deferoxamine, Janet L. Kwiatkowski, MD, of the Children’s Hospital of Philadelphia reported at the annual meeting of the American Society of Hematology.
The upper limit of the stringent 96.01% confidence interval used for the evaluation of noninferiority in the study was 1.57, thus the findings demonstrated noninferiority of deferiprone, Dr. Kwiatkowski said.
Deferiprone also showed noninferiority for the secondary endpoints of change in cardiac iron (about –0.02 ms on T2* MRI, log-transformed for both groups) and serum ferritin levels (–415 vs. –750 mcg/L for deferiprone vs. deferoxamine) at 12 months. The difference between the groups was not statistically significant for either endpoint.
Study participants, who had a mean age of 16.9 years, were aged 2 years and older with LIC between 7 and 30 mg/g dw. They were recruited from 33 sites in nine countries and randomized 2:1 to receive deferiprone or deferoxamine for up to 12 months; in patients with lower transfusional iron input and/or less severe iron load, deferiprone was dosed at 75 mg/kg daily and deferoxamine was dosed at 20 mg/kg for children and 40 mg/kg for adults. In those with higher iron input and/or more severe iron load, the deferiprone dose was 99 mg/kg daily and the deferoxamine doses were up to 40 mg/kg in children and up to 50 mg/kg for adults.
“Over the course of the treatment period, the dosage could be adjusted downward if there were side effects, or upward if there was no improvement in iron burden,” Dr Kwiatkowski said, adding that after 12 months, patients had the option of continuing on to a 2-year extension trial in which everyone received deferiprone.
No significant demographic differences were noted between the groups; 84% in both groups had sickle cell disease, and the remaining patients had other, rarer forms of transfusion-dependent anemia. Baseline iron burden was similar in the groups.
The rates of acceptable compliance over the course of the study were also similar at 69% and 79% in the deferiprone and deferoxamine arms, respectively, she noted.
No statistically significant difference between the groups was seen in the overall rate of adverse events, treatment-related AEs, serious AEs, or withdrawals from the study due to AEs. Agranulocytosis occurred in one deferiprone patient and zero deferoxamine patients, and mild or moderate neutropenia occurred in four patients and one patient in the groups, respectively.
All episodes resolved, no difference was seen in the rates of any of the serious AEs, and no unexpected serious adverse events occurred, she said.
Patients with sickle cell disease or other rare anemias whose care includes chronic blood transfusions require iron chelation to prevent iron overload. Currently, only deferoxamine and deferasirox are approved chelators in these patient populations, she said, noting that in 2011 deferiprone received accelerated Food and Drug Administration approval for the treatment of thalassemia.
The current study was conducted because of an FDA requirement for postmarket assessment of deferiprone’s efficacy and safety in patients with sickle cell disease and other anemias who develop transfusional iron overload. It was initiated prior to the approval of deferasirox for the first-line treatment of SCD, therefore it was compared only with deferoxamine, she explained.
Dr. Kwiatkowski reported research funding from Apopharma, bluebird bio, Novartis, and Terumo, and consultancy for Agios, bluebird bio, Celgene, and Imara.
ORLANDO – The oral iron chelator deferiprone showed noninferiority to deferoxamine for treating iron overload in patients with sickle cell disease and other rare anemias in a randomized open-label trial.
The least squares mean change from baseline in liver iron concentration (LIC) – the primary study endpoint – was –4.04 mg/g dry weight (dw) in 152 patients randomized to receive deferiprone, and –4.45 mg/g dw in 76 who received deferoxamine, Janet L. Kwiatkowski, MD, of the Children’s Hospital of Philadelphia reported at the annual meeting of the American Society of Hematology.
The upper limit of the stringent 96.01% confidence interval used for the evaluation of noninferiority in the study was 1.57, thus the findings demonstrated noninferiority of deferiprone, Dr. Kwiatkowski said.
Deferiprone also showed noninferiority for the secondary endpoints of change in cardiac iron (about –0.02 ms on T2* MRI, log-transformed for both groups) and serum ferritin levels (–415 vs. –750 mcg/L for deferiprone vs. deferoxamine) at 12 months. The difference between the groups was not statistically significant for either endpoint.
Study participants, who had a mean age of 16.9 years, were aged 2 years and older with LIC between 7 and 30 mg/g dw. They were recruited from 33 sites in nine countries and randomized 2:1 to receive deferiprone or deferoxamine for up to 12 months; in patients with lower transfusional iron input and/or less severe iron load, deferiprone was dosed at 75 mg/kg daily and deferoxamine was dosed at 20 mg/kg for children and 40 mg/kg for adults. In those with higher iron input and/or more severe iron load, the deferiprone dose was 99 mg/kg daily and the deferoxamine doses were up to 40 mg/kg in children and up to 50 mg/kg for adults.
“Over the course of the treatment period, the dosage could be adjusted downward if there were side effects, or upward if there was no improvement in iron burden,” Dr Kwiatkowski said, adding that after 12 months, patients had the option of continuing on to a 2-year extension trial in which everyone received deferiprone.
No significant demographic differences were noted between the groups; 84% in both groups had sickle cell disease, and the remaining patients had other, rarer forms of transfusion-dependent anemia. Baseline iron burden was similar in the groups.
The rates of acceptable compliance over the course of the study were also similar at 69% and 79% in the deferiprone and deferoxamine arms, respectively, she noted.
No statistically significant difference between the groups was seen in the overall rate of adverse events (AEs), treatment-related AEs, serious AEs, or withdrawals from the study due to AEs. Agranulocytosis occurred in one deferiprone patient and no deferoxamine patients, and mild or moderate neutropenia occurred in four deferiprone patients and one deferoxamine patient.
All episodes resolved, no difference was seen in the rates of any of the serious AEs, and no unexpected serious adverse events occurred, she said.
Patients with sickle cell disease or other rare anemias whose care includes chronic blood transfusions require iron chelation to prevent iron overload. Currently, only deferoxamine and deferasirox are approved chelators in these patient populations, she said, noting that in 2011 deferiprone received accelerated Food and Drug Administration approval for the treatment of thalassemia.
The current study was conducted because of an FDA requirement for postmarket assessment of deferiprone’s efficacy and safety in patients with sickle cell disease and other anemias who develop transfusional iron overload. It was initiated prior to the approval of deferasirox for the first-line treatment of SCD; therefore, deferiprone was compared only with deferoxamine, she explained.
Dr. Kwiatkowski reported research funding from Apopharma, bluebird bio, Novartis, and Terumo, and consultancy for Agios, bluebird bio, Celgene, and Imara.
REPORTING FROM ASH 2019
Initial ultrasound assessment of appendicitis curbs costs
Assessing appendicitis in children with initial ultrasound followed by computed tomography in the absence of appendix visualization and presence of secondary signs was the most cost-effective approach, according to data from a modeling study of 10 strategies.
Ultrasound is safer and less expensive than computed tomography and avoids radiation exposure; however, cost-effectiveness models of various approaches to imaging have not been well studied, wrote Rebecca Jennings, MD, of Seattle Children’s Hospital, Washington, and colleagues.
In a study published in Pediatrics, the researchers simulated a hypothetical patient population using a Markov cohort model and compared 10 different strategies including CT only, MRI only, and ultrasound followed by CT or MRI after ultrasounds that are negative or fail to visualize the appendix.
Overall, the most cost-effective strategy for moderate-risk patients was the use of ultrasound followed by CT or MRI if the ultrasound failed to visualize the appendix and secondary signs of inflammation were present in the right lower quadrant. The cost of this strategy was $4,815, with effectiveness of 0.99694 quality-adjusted life-years. “The most cost-effective strategy is highly dependent on a patient’s risk stratification,” the researchers noted. Based on their model, imaging was not cost effective for patients with an appendicitis prevalence of less than 16% or greater than 95%. However, those with appendicitis risk between 16% and 95% and no secondary signs of inflammation can forgo further imaging, even without visualization of the appendix, for maximum cost-effectiveness, the researchers said.
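To make the comparison concrete, cost-effectiveness analyses of this kind typically rank strategies by net monetary benefit (cost and QALYs converted to a single dollar figure at a willingness-to-pay threshold). The sketch below is illustrative only: the winning strategy's cost ($4,815) and effectiveness (0.99694 QALYs) are from the study, but the other strategies' numbers and the threshold are hypothetical, and this is not the authors' Markov model.

```python
# Illustrative strategy ranking by net monetary benefit (NMB).
# Only the first strategy's cost and QALYs are reported study values;
# the other entries and the willingness-to-pay threshold are hypothetical.
strategies = {
    "US -> CT/MRI if secondary signs": (4815, 0.99694),  # reported values
    "CT only": (5200, 0.99700),                          # hypothetical
    "MRI only": (5900, 0.99705),                         # hypothetical
}
wtp = 100_000  # hypothetical willingness to pay, $ per QALY


def nmb(cost, qaly, wtp=wtp):
    # NMB converts health gain to dollars and subtracts the cost.
    return qaly * wtp - cost


best = max(strategies, key=lambda s: nmb(*strategies[s]))
```

Under these assumed inputs, the small QALY gains of the pricier strategies do not offset their extra cost, so the ultrasound-first strategy ranks highest.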
The study was limited by several factors, including the inability to account for all potential costs related to imaging and outcomes, lack of accounting for the use of sedation when assessing costs, and inability to separate imaging costs from total hospital costs, the researchers noted. However, the results suggest that tailored imaging approaches based on patient risk are the most cost-effective strategies to assess appendicitis, they said.
“The diagnosis and exclusion of appendicitis continues to be one of the primary concerns of providers who care for children with abdominal pain,” wrote Rebecca M. Rentea, MD, and Charles L. Snyder, MD, of Children’s Mercy Hospital, Kansas City, Mo., in an accompanying editorial (Pediatrics. 2020 Feb;145:e20193349).
“The best diagnostic and imaging approach to appendicitis has been a topic of interest for some time, and improvements such as appendicitis scoring systems, decreased use of ionizing radiation, and adoption of clinical algorithms have been incremental but steady,” they said. Despite the potential for missed appendicitis, the algorithm described in the study, based on an initial ultrasound and the pretest probability of appendicitis, was the most cost effective, they said. In addition, “the ability to visualize the appendix did not alter the most cost-effective approach in those with a moderate risk of appendicitis (most patients),” they concluded.
The study was supported by the University of Washington and Seattle Children’s Hospital Quality Improvement Scholars Program. The researchers had no financial conflicts to disclose.
Dr. Rentea and Dr. Snyder had no financial conflicts to disclose.
SOURCE: Jennings R et al. Pediatrics. 2020. doi: 10.1542/peds.2019-1352.
FROM PEDIATRICS
Novel mutations contribute to progression of venetoclax-treated CLL
Newly discovered BCL2 mutations in patients with relapsed chronic lymphocytic leukemia (CLL) progressing on venetoclax may improve understanding of the clinical resistance mechanisms underlying the disease, according to recent research.
“We investigated patients with progressive CLL on venetoclax harboring subclonal BCL2 Gly101Val mutations for the presence of additional acquired BCL2 resistance mutations,” wrote Piers Blombery, MBBS, of the University of Melbourne in Victoria, Australia, and his colleagues in Blood.
Among 67 patients with relapsed disease treated with the BCL2 inhibitor venetoclax, the researchers identified 11 patients harboring subclonal BCL2 Gly101Val mutations. Each patient was enrolled in an early-phase clinical trial at an institution in Australia.
With respect to testing methods, next-generation sequencing (NGS) and hybridization-based target enrichment technologies were used to detect novel acquired mutations in the BCL2 coding region.
Among those harboring the Gly101Val mutation, additional BCL2 mutations were identified in 10 patients (91%), with a median of three mutations detected per patient (range, 1-7). Previously undescribed mutations included an in-frame insertion mutation (Arg107_Arg110dup), and other substitutions (Asp103/Val156) in the BCL2 gene.
“As with the Gly101Val, these observations support the specificity of these mutations for the context of venetoclax resistance,” they wrote.
The investigators further explained that the BCL2 Asp103Glu mutation could have particular significance in the context of venetoclax sensitivity because of selective targeting of the BCL2 gene.
Compared with the wild-type aspartic acid residue, the BCL2 Asp103Glu substitution was linked to an approximately 20-fold reduction in affinity for venetoclax, they reported.
“[Our findings] consolidate the paradigm emerging across hematological malignancies of multiple independent molecular mechanisms underpinning an ‘oligoclonal’ pattern of clinical relapse on targeted therapies,” they concluded.
Further studies are needed to fully characterize the relationship between acquired BCL2 mutations and venetoclax resistance.
The study was funded by the Snowdome Foundation, Vision Super and the Wilson Centre for Lymphoma Genomics, the Leukemia and Lymphoma Society, the National Health and Medical Research Council of Australia, and other grant funding sources provided to the study authors. The authors reported financial affiliations with AbbVie, Genentech, and the Walter and Eliza Hall Institute.
FROM BLOOD
Medical scribe use linked to lower physician burnout
The incorporation of medical scribes into an outpatient oncology setting may lower physician burnout and improve patient care, according to a retrospective study.
“The objective of this study was to determine the effect of scribe integration on clinic workflow efficiency and physician satisfaction and quality of life in outpatient oncology clinics,” wrote Rebecca W. Gao, MD, of Stanford (Calif.) Medicine, and colleagues in the Journal of Oncology Practice.
The researchers retrospectively analyzed patient and survey data from 129 physicians affiliated with a tertiary care academic medical center during 2017-2019. In the study, 33 physicians were paired with a scribe, while 96 were not.
During each patient encounter, visit duration times were recorded into an electronic medical record by a medical scribe. The scribes also performed a variety of other tasks, including collating lab results, documenting medical history, and completing postvisit summaries.
In the analysis, the team compared average visit duration times between physicians with and without a scribe. The effects of scribe integration on individual physician’s visit times were also assessed.
After analysis, the researchers found that physicians with a scribe experienced a 12.1% reduction in overall average patient visit duration, compared with visit times before scribe integration (P less than .0001). They also reported that less time was spent charting at the end of the day (P = .04).
“Compared with their peers, oncologists with scribes showed a 10%-20% decrease in the duration of all patient visits,” they explained.
With respect to patient care, survey results revealed that 90% of physicians strongly agreed that they spent more time with patients and less time at the computer. “100% of physicians surveyed ‘strongly agreed’ that scribes improved their quality of life,” they added.
The researchers acknowledged that a key limitation of the study was the single-center design. As a result, these findings may not be applicable to physicians practicing in community-based settings.
Further studies could include financial analyses to evaluate the cost-effectiveness of medical scribe use in oncology practices, they noted.
“Our study suggests that scribes can be successfully integrated into oncology clinics and may benefit physician quality of life, clinic workflow efficiency, and the quality of physician-patient interactions,” they concluded.
The study was funded by the Stanford Cancer Center. One study author reported financial affiliations with SurgVision, Vergent Biotechnology, Novadaq Technologies, and LI-COR Biosciences.
SOURCE: Gao RW et al. J Oncol Pract. 2019 Dec 5. doi: 10.1200/JOP.19.00307.
FROM JOURNAL OF ONCOLOGY PRACTICE
Noninjectable modes of insulin delivery coming of age
LOS ANGELES – Injections may be the most common way for patients with diabetes to take insulin, but other modes of delivery are coming of age.
George Grunberger, MD, chairman of the Grunberger Diabetes Institute in Bloomfield Township, Mich., said that at least seven different agents are being studied for the oral delivery of biologics for diabetes.
He outlined several at the World Congress on Insulin Resistance, Diabetes & Cardiovascular Disease.
Oral insulin
ORMD-0801 from Oramed is an oral insulin capsule that prevents enzyme degradation and enhances intestinal absorption. Top-line, unpublished findings from a phase 2 study, which the company announced in November 2019, showed that ORMD-0801 significantly reduced hemoglobin A1c levels in patients with type 2 diabetes who were inadequately controlled on other standard-of-care drugs. ORMD-0801 dosed once daily reduced HbA1c by 0.60%, compared with 0.06% by placebo. “We’ll see when it’s going to wind up in the clinic,” Dr. Grunberger said. Oramed is also developing an oral glucagonlike peptide–1 analogue capsule, ORMD-0901, which has potential to be the first orally ingestible GLP-1 analogue.
Inhaled and absorbed insulin
Technosphere insulin (Afrezza) is a novel inhalation powder for the treatment of diabetes that was developed by MannKind and approved by the Food and Drug Administration in 2014. Clinical studies have shown that Technosphere insulin delivers insulin with an ultrarapid pharmacokinetic profile that is different from all other insulin products, but similar to natural insulin release. “The idea was to develop a more patient-friendly device to deliver insulin directly into the lungs,” said Dr. Grunberger, who is also a clinical professor of internal medicine and molecular medicine and genetics at Wayne State University, Detroit. “When you inhale this into the lungs, there is one cell layer between the air sac and the circulation, so it works very quickly. The idea is to try to avoid injecting insulin to see if it helps. This is a prandial insulin – you inhale it before meals. The whole idea is that hopefully, you can reduce any fear of delayed postprandial hyperglycemia.”
In a randomized trial of 353 patients with inadequately controlled type 2 diabetes, those in the Technosphere insulin arm significantly reduced HbA1c by 0.8% from a baseline of 8.3%, compared with the placebo arm, which was reduced by 0.4% (P less than .0001; Diabetes Care. 2015;38[12]:2274-81). A greater number of patients treated with Technosphere insulin achieved an HbA1c of 7.0% or less, compared with placebo (38% vs. 19%; P = .0005). Dr. Grunberger noted that, in clinical trials lasting up to 2 years, patients treated with Technosphere insulin had a 40-mL greater decline from baseline in forced expiratory volume in 1 second (FEV1), compared with patients treated with comparator antidiabetes treatments. “But once you stop using the drug, FEV1 reverts to normal,” he said. “So, there does not appear to be lasting damage to your lungs and respiratory ability.”
In another development, Oral-Lyn from Generex Biotechnology, which delivers insulin through the oral mucosa, is being evaluated as a potential treatment option. In 2015, Generex partnered with the University of Toronto’s Center for Molecular Design and Preformulations to increase the bioavailability of insulin in the product and to reduce the number of sprays required to achieve effective prandial glucose control. In 2019, the company formed the NuGenerex Diabetes Research Center, which intended to accelerate the development of the reformulated Oral-Lyn-2, for type 2 diabetes, and Altsulin, for the treatment of type 1 diabetes. The programs are expected to initiate in the first quarter of 2020.
In the meantime, studies of intranasally delivered insulin continue to advance. “It works. It lowers glucose, but there is a whole slew of knowledge now about how it can also improve neurocognitive function,” Dr. Grunberger said.
Oral GLP-1 receptor agonists
Oral versions of glucagonlike peptide–1 (GLP-1) receptor agonists are also emerging as a treatment option. The FDA recently approved the first oral GLP-1 receptor agonist, semaglutide bound in the absorption enhancer sodium N‐(8‐[2‐hydroxybenzoyl] amino) caprylate (SNAC). According to data from manufacturer Novo Nordisk, SNAC facilitates local increase of pH, which leads to a higher solubility. SNAC interacts with cell membranes of gastric mucosa, facilitating absorption within 30 minutes, “so the drug can penetrate the mucosa without lasting damage,” Dr. Grunberger said. The SNAC effect is size dependent and fully reversible.
In PIONEER 3, researchers found that, in adults with type 2 diabetes uncontrolled with metformin with or without sulfonylurea, oral semaglutide at dosages of 7 and 14 mg/day resulted in significantly greater reductions in HbA1c over 26 weeks, compared with sitagliptin, but there was no significant benefit with the 3-mg/d dosage (JAMA. 2019;321[15]:1466-80). In PIONEER 4, researchers compared the efficacy and safety of oral semaglutide with subcutaneous liraglutide (Lancet. 2019;394[10192]:P39-50). “There was no difference in HbA1c effect between the two groups, but oral semaglutide beat out sitagliptin in terms of weight loss,” Dr. Grunberger said. “It’s going to be interesting to see what’s going to happen in the marketplace as the drug gets widely launched.”
Nasal glucagon
He closed out his presentation by discussing the July 2019 FDA approval of Eli Lilly’s nasal glucagon for severe hypoglycemia – the first such treatment that can be administered without an injection. The nasally administered dry powder, known as Baqsimi, is a welcome alternative to current glucagon kits, “which contain multiple components,” said Dr. Grunberger, who is also a past president of the American Association of Clinical Endocrinologists. An adult pivotal study showed that supraphysiologic levels of glucagon were achieved within 5 minutes with both nasal and intramuscular glucagon (Diabetes Care. 2016;39[2]:264-70). Headache and nasal symptoms occurred more frequently with nasal glucagon, but most were resolved within 1 day. In addition, nausea and vomiting occurred at similar frequencies with nasal and intramuscular glucacon, and most cases were resolved within 1 day.
Similar results were observed in a pediatric study of 48 patients with type 1 diabetes who were older than 4 years, (Diabetes Care. 2016;39[4]:555-62).
Dr. Grunberger disclosed that has research contracts with Medtronic and Eli Lilly, and that he serves on speakers bureaus of Eli Lilly, Janssen, Novo Nordisk, and Sanofi.
LOS ANGELES – Injections may be the most common way for patients with diabetes to take insulin, but other modes of delivery are coming of age.
George Grunberger, MD, chairman of the Grunberger Diabetes Institute in Bloomfield Township, Mich., said that at least seven different agents are being studied for the oral delivery of biologics for diabetes.
He outlined several at the World Congress on Insulin Resistance, Diabetes & Cardiovascular Disease.
Oral insulin
ORMD-0801 from Oramed is an oral insulin capsule that prevents enzyme degradation and enhances intestinal absorption. Top-line, unpublished findings from a phase 2 study, which the company announced in November 2019, showed that ORMD-0801 significantly reduced hemoglobin A1c levels in patients with type 2 diabetes who were inadequately controlled on other standard-of-care drugs. ORMD-0801 dosed once daily reduced HbA1c by 0.60%, compared with 0.06% by placebo. “We’ll see when it’s going to wind up in the clinic,” Dr. Grunberger said. Oramed is also developing an oral glucagonlike peptide–1 analogue capsule, ORMD-0901, which has the potential to be the first orally ingestible GLP-1 analogue.
Inhaled and absorbed insulin
Technosphere insulin (Afrezza) is a novel inhalation powder for the treatment of diabetes that was developed by MannKind and approved by the Food and Drug Administration in 2014. Clinical studies have shown that Technosphere insulin delivers insulin with an ultrarapid pharmacokinetic profile that is different from all other insulin products, but similar to natural insulin release. “The idea was to develop a more patient-friendly device to deliver insulin directly into the lungs,” said Dr. Grunberger, who is also a clinical professor of internal medicine and molecular medicine and genetics at Wayne State University, Detroit. “When you inhale this into the lungs, there is one cell layer between the air sac and the circulation, so it works very quickly. The idea is to try to avoid injecting insulin to see if it helps. This is a prandial insulin – you inhale it before meals. The whole idea is that hopefully, you can reduce any fear of delayed postprandial hyperglycemia.”
In a randomized trial of 353 patients with inadequately controlled type 2 diabetes, those in the Technosphere insulin arm significantly reduced HbA1c by 0.8% from a baseline of 8.3%, compared with the placebo arm, which was reduced by 0.4% (P less than .0001; Diabetes Care. 2015;38[12]:2274-81). A greater number of patients treated with Technosphere insulin achieved an HbA1c of 7.0% or less, compared with placebo (38% vs. 19%; P = .0005). Dr. Grunberger noted that, in clinical trials lasting up to 2 years, patients treated with Technosphere insulin had a 40-mL greater decline from baseline in forced expiratory volume in 1 second (FEV1), compared with patients treated with comparator antidiabetes treatments. “But once you stop using the drug, FEV1 reverts to normal,” he said. “So, there does not appear to be lasting damage to your lungs and respiratory ability.”
In another development, Oral-Lyn from Generex Biotechnology, which delivers insulin through the oral mucosa, is being evaluated as a potential treatment option. In 2015, Generex partnered with the University of Toronto’s Center for Molecular Design and Preformulations to increase the bioavailability of insulin in the product and to reduce the number of sprays required to achieve effective prandial glucose control. In 2019, the company formed the NuGenerex Diabetes Research Center, which intended to accelerate the development of the reformulated Oral-Lyn-2, for type 2 diabetes, and Altsulin, for the treatment of type 1 diabetes. The programs are expected to initiate in the first quarter of 2020.
In the meantime, studies of intranasally delivered insulin continue to advance. “It works. It lowers glucose, but there is a whole slew of knowledge now about how it can also improve neurocognitive function,” Dr. Grunberger said.
Oral GLP-1 receptor agonists
Oral versions of glucagonlike peptide–1 (GLP-1) receptor agonists are also emerging as a treatment option. The FDA recently approved the first oral GLP-1 receptor agonist, semaglutide bound in the absorption enhancer sodium N‐(8‐[2‐hydroxybenzoyl] amino) caprylate (SNAC). According to data from manufacturer Novo Nordisk, SNAC facilitates a local increase in pH, which leads to higher solubility. SNAC interacts with the cell membranes of the gastric mucosa, facilitating absorption within 30 minutes, “so the drug can penetrate the mucosa without lasting damage,” Dr. Grunberger said. The SNAC effect is size dependent and fully reversible.
In PIONEER 3, researchers found that, in adults with type 2 diabetes uncontrolled with metformin with or without sulfonylurea, oral semaglutide at dosages of 7 and 14 mg/day resulted in significantly greater reductions in HbA1c over 26 weeks, compared with sitagliptin, but there was no significant benefit with the 3-mg/d dosage (JAMA. 2019;321[15]:1466-80). In PIONEER 4, researchers compared the efficacy and safety of oral semaglutide with subcutaneous liraglutide (Lancet. 2019;394[10192]:39-50). “There was no difference in HbA1c effect between the two groups, but oral semaglutide beat out sitagliptin in terms of weight loss,” Dr. Grunberger said. “It’s going to be interesting to see what’s going to happen in the marketplace as the drug gets widely launched.”
Nasal glucagon
He closed out his presentation by discussing the July 2019 FDA approval of Eli Lilly’s nasal glucagon for severe hypoglycemia – the first such treatment that can be administered without an injection. The nasally administered dry powder, known as Baqsimi, is a welcome alternative to current glucagon kits, “which contain multiple components,” said Dr. Grunberger, who is also a past president of the American Association of Clinical Endocrinologists. An adult pivotal study showed that supraphysiologic levels of glucagon were achieved within 5 minutes with both nasal and intramuscular glucagon (Diabetes Care. 2016;39[2]:264-70). Headache and nasal symptoms occurred more frequently with nasal glucagon, but most were resolved within 1 day. In addition, nausea and vomiting occurred at similar frequencies with nasal and intramuscular glucagon, and most cases were resolved within 1 day.
Similar results were observed in a pediatric study of 48 patients with type 1 diabetes who were older than 4 years (Diabetes Care. 2016;39[4]:555-62).
Dr. Grunberger disclosed that he has research contracts with Medtronic and Eli Lilly, and that he serves on speakers bureaus of Eli Lilly, Janssen, Novo Nordisk, and Sanofi.
EXPERT ANALYSIS FROM WCIRDC 2019
Expanded indication for leadless pacemaker triples eligible patients
The U.S. Food and Drug Administration’s approval of an expanded indication for a leadless pacemaker for patients “who may benefit from maintenance of atrioventricular synchrony” will make this technology potentially available to nearly half of the Americans who need a pacemaker, roughly triple the number of patients who have been candidates for a leadless pacemaker up to now.
“This approval was huge. The complication rate with leadless pacemakers has been 63% less than the rate using pacemakers with transvenous leads,” said Larry A. Chinitz, MD, a cardiac electrophysiologist and a coinvestigator on some of the studies that led to the new indication. By expanding the types of patients suitable for leadless pacing “we’ll achieve AV [atrioventricular] synchrony in more patients with fewer complications,” said Dr. Chinitz, professor of medicine and director of the Cardiac Electrophysiology and Heart Rhythm Center at NYU Langone Health in New York.
Because the device is both leadless and requires no pocket owing to its small size and placement in a patient’s right ventricle, it has implications for potentially broadening the population that could benefit from the device, he said in an interview. “When we started with this pacemaker, it was limited to elderly patients with persistent atrial fibrillation who needed only ventricular pacing, a very small group,” just under 15% of the universe of patients who need pacemakers. The broadened indication, for patients with high-grade AV block who also have atrial function, makes it possible to think of using this safer and easier-to-place device in patients who need infrequent pacing, and in patients with multiple comorbidities that give them an increased complication risk, he said. The new indication means “you’re treating a much broader patient population, doing it more safely, and creating the foundation for expanding this technology.”
The Micra AV pacemaker uses the same basic design as the previously approved Micra Transcatheter Pacing System, which came onto the U.S. market in 2016 and provides single-chamber pacing. An accelerometer on the device allows it to detect atrial motion and thereby synchronize ventricular and atrial contractions, which led to the new indication. Although the Micra AV device looks similar to the original single-chamber model, it has an entirely new circuitry that prolongs battery life during dual-chamber pacing as well as new software that incorporates the accelerometer data, explained Robert Kowal, MD, a cardiac electrophysiologist, and vice president of medical affairs and chief medical officer of cardiac rhythm and heart failure at Medtronic in Minneapolis. The battery of the Micra AV is designed to last about 15 years, Dr. Chinitz noted.
Results from two studies that Dr. Chinitz helped run established the safety and efficacy of the device for dual-chamber pacing. The MARVEL (Micra Atrial Tracking Using a Ventricular Accelerometer) study included 64 patients who completed the study at 12 centers worldwide and showed an average of 80% AV synchrony in the 33 patients with high-degree AV block (the other patients in the study had predominantly intrinsic AV conduction; Heart Rhythm. 2018 Sep;15[9]:1363-71). The MARVEL 2 study included 75 patients with either second- or third-degree AV block at 12 centers worldwide and showed that AV synchrony increased from an average of 27% without two-chamber pacing to 89% with the dual-chamber function turned on, with 95% of patients achieving at least 70% AV synchrony (JACC Clin Electrophysiol. 2020 Jan;6[1]:94-106).
The 2016 indication for single-chamber pacing included patients with “high-grade” AV block with or without atrial fibrillation, typically patients for whom a dual-chamber pacemaker was not a great option because of the risk of complications, but with the downside of limited AV synchrony, a limitation now mitigated by the option of mechanical synchronization, Dr. Kowal said. The AV device remains intended for patients with high-grade AV node block, which means patients with second- or third-degree block, he added in an interview. The estimated prevalence of third-degree AV block among U.S. adults is about 0.02%, which translates into about 50,000 people; the estimated prevalence of second-degree AV block is much less, about 10% of the third-degree prevalence.
Despite the substantial cut in complications by a leadless and pocketless pacemaker, “some patients may still benefit from a traditional dual-chamber pacemaker,” specifically active patients who might sometimes get their heart rates up with exercise to levels of about 150 beats/min or higher, Dr. Kowal said. That’s because currently the programming algorithms used to synchronize the ventricle and atrium become less reliable at heart rates above 105 beats/min, he explained. However, the ability for mechanical synchronization to keep up at higher heart rates should improve as additional data are collected that can refine the algorithms. It’s also unusual for most patients who are pacemaker candidates to reach heart rates this high, he said.
The MARVEL and MARVEL 2 studies were sponsored by Medtronic, the company that markets Micra pacemakers. Dr. Chinitz has received fees and fellowship support from Medtronic, and has also received fees from Abbott, Biosense Webster, Biotronik, and Pfizer, and he has also received fellowship support from Biotronik and Boston Scientific. Dr. Kowal is a Medtronic employee.
The U.S. Food and Drug Administration’s approval of an expanded indication for a leadless pacemaker for patients “who may benefit from maintenance of atrioventricular synchrony” will make this technology potentially available to nearly half of the Americans who need a pacemaker, roughly triple the number of patients who have been candidates for a leadless pacemaker up to now.
“This approval was huge. The complication rate with leadless pacemakers has been 63% less than the rate using pacemakers with transvenous leads,” said Larry A. Chinitz, MD, a cardiac electrophysiologist and a coinvestigator on some of the studies that led to the new indication. By expanding the types of patients suitable for leadless pacing “we’ll achieve AV [atrioventricular] synchrony in more patients with fewer complications,” said Dr. Chinitz, professor of medicine and director of the Cardiac Electrophysiology and Heart Rhythm Center at NYU Langone Health in New York.
Because the device is both leadless and requires no pocket owing to its small size and placement in a patient’s right ventricle, it has implications for potentially broadening the population that could benefit from the device, he said in an interview. “When we started with this pacemaker, it was limited to elderly patients with persistent atrial fibrillation who needed only ventricular pacing, a very small group,” just under 15% of the universe of patients who need pacemakers. The broadened indication, for patients with high-grade AV block who also have atrial function, makes it possible to think of using this safer and easier-to-place device in patients who need infrequent pacing, and in patients with multiple comorbidities that give them an increased complication risk, he said. The new indication means “you’re treating a much broader patient population, doing it more safely, and creating the foundation for expanding this technology.”
The Micra AV pacemaker uses the same basic design as the previously approved Micra Transcatheter Pacing System, which came onto the U.S. market in 2016 and provides single-chamber pacing. An accelerometer on the device allows it to detect atrial motion and thereby synchronize ventricular and atrial contractions, which led to the new indication. Although the Micra AV device looks similar to the original single-chamber model, it has an entirely new circuitry that prolongs battery life during dual-chamber pacing as well as new software that incorporates the accelerometer data, explained Robert Kowal, MD, a cardiac electrophysiologist, and vice president of medical affairs and chief medical officer of cardiac rhythm and heart failure at Medtronic in Minneapolis. The battery of the Micra AV is designed to last about 15 years, Dr. Chinitz noted.
Results from two studies that Dr. Chinitz helped run established the safety and efficacy of the device for dual-chamber pacing. The MARVEL (Micra Atrial Tracking Using a Ventricular Accelerometer) study included 64 patients who completed the study at 12 worldwide centers, which produced an average 80% AV synchrony in 33 patients with high-degree AV block (The other patients in the study had predominantly intrinsic AV conduction; Heart Rhythm. 2018 Sep;15[9]:1363-71). The MARVEL 2 study included 75 patients with either second- or third-degree AV block at 12 worldwide centers and showed that AV synchrony increased from an average of 27% without two-chamber pacing to 89% with the dual-chamber function turned on, and with 95% of patients achieving at least 70% AV synchrony (JACC Clin Electrophysiol. 2020 Jan;6[1]:94-106).
The 2016 indication for single-chamber pacing included patients with “high-grade” AV bloc with or without atrial fibrillation, typically patients for whom dual-chamber pacemaker was not a great option because of the risks for complication but with the downside of limited AV synchrony, a limitation now mitigated by the option of mechanical synchronization, Dr. Kowal said. The AV device remains intended for patients with high-grade AV node block, which means patients with second- or third-degree block, he added in an interview. The estimated prevalence of third-degree AV block among U.S. adults is about 0.02%, which translates into about 50,000 people; the estimated prevalence of second-degree AV block is much less, about 10% of the third-degree prevalence.
Despite the substantial cut in complications by a leadless and pocketless pacemaker, “some patients may still benefit from a traditional dual-chamber pacemaker,” specifically active patients who might sometimes get their heart rates up with exercise to levels of about 150 beats/min or higher, Dr. Kowal said. That’s because currently the programing algorithms used to synchronize the ventricle and atrium become less reliable at heart rates above 105 beats/min, he explained. However, the ability for mechanical synchronization to keep up at higher heart rates should improve as additional data are collected that can refine the algorithms. It’s also unusual for most patients who are pacemaker candidates to reach heart rates this high, he said.
The MARVEL and MARVEL 2 studies were sponsored by Medtronic, the company that markets Micra pacemakers. Dr. Chinitz has received fees and fellowship support from Medtronic, and has also received fees from Abbott, Biosense Webster, Biotronik, and Pfizer, and he has also received fellowship support from Biotronik and Boston Scientific. Dr. Kowal is a Medtronic employee.
The U.S. Food and Drug Administration’s approval of an expanded indication for a leadless pacemaker for patients “who may benefit from maintenance of atrioventricular synchrony” will make this technology potentially available to nearly half of the Americans who need a pacemaker, roughly triple the number of patients who have been candidates for a leadless pacemaker up to now.
“This approval was huge. The complication rate with leadless pacemakers has been 63% less than the rate using pacemakers with transvenous leads,” said Larry A. Chinitz, MD, a cardiac electrophysiologist and a coinvestigator on some of the studies that led to the new indication. By expanding the types of patients suitable for leadless pacing “we’ll achieve AV [atrioventricular] synchrony in more patients with fewer complications,” said Dr. Chinitz, professor of medicine and director of the Cardiac Electrophysiology and Heart Rhythm Center at NYU Langone Health in New York.
Because the device is both leadless and requires no pocket owing to its small size and placement in a patient’s right ventricle, it has implications for potentially broadening the population that could benefit from the device, he said in an interview. “When we started with this pacemaker, it was limited to elderly patients with persistent atrial fibrillation who needed only ventricular pacing, a very small group,” just under 15% of the universe of patients who need pacemakers. The broadened indication, for patients with high-grade AV block who also have atrial function, makes it possible to think of using this safer and easier-to-place device in patients who need infrequent pacing, and in patients with multiple comorbidities that give them an increased complication risk, he said. The new indication means “you’re treating a much broader patient population, doing it more safely, and creating the foundation for expanding this technology.”
The Micra AV pacemaker uses the same basic design as the previously approved Micra Transcatheter Pacing System, which came onto the U.S. market in 2016 and provides single-chamber pacing. An accelerometer on the device allows it to detect atrial motion and thereby synchronize ventricular and atrial contractions, which led to the new indication. Although the Micra AV device looks similar to the original single-chamber model, it has an entirely new circuitry that prolongs battery life during dual-chamber pacing as well as new software that incorporates the accelerometer data, explained Robert Kowal, MD, a cardiac electrophysiologist, and vice president of medical affairs and chief medical officer of cardiac rhythm and heart failure at Medtronic in Minneapolis. The battery of the Micra AV is designed to last about 15 years, Dr. Chinitz noted.
Results from two studies that Dr. Chinitz helped run established the safety and efficacy of the device for dual-chamber pacing. The MARVEL (Micra Atrial Tracking Using a Ventricular Accelerometer) study included 64 patients who completed the study at 12 worldwide centers, which produced an average 80% AV synchrony in 33 patients with high-degree AV block (The other patients in the study had predominantly intrinsic AV conduction; Heart Rhythm. 2018 Sep;15[9]:1363-71). The MARVEL 2 study included 75 patients with either second- or third-degree AV block at 12 worldwide centers and showed that AV synchrony increased from an average of 27% without two-chamber pacing to 89% with the dual-chamber function turned on, and with 95% of patients achieving at least 70% AV synchrony (JACC Clin Electrophysiol. 2020 Jan;6[1]:94-106).
The 2016 indication for single-chamber pacing included patients with “high-grade” AV bloc with or without atrial fibrillation, typically patients for whom dual-chamber pacemaker was not a great option because of the risks for complication but with the downside of limited AV synchrony, a limitation now mitigated by the option of mechanical synchronization, Dr. Kowal said. The AV device remains intended for patients with high-grade AV node block, which means patients with second- or third-degree block, he added in an interview. The estimated prevalence of third-degree AV block among U.S. adults is about 0.02%, which translates into about 50,000 people; the estimated prevalence of second-degree AV block is much less, about 10% of the third-degree prevalence.
Despite the substantial cut in complications by a leadless and pocketless pacemaker, “some patients may still benefit from a traditional dual-chamber pacemaker,” specifically active patients who might sometimes get their heart rates up with exercise to levels of about 150 beats/min or higher, Dr. Kowal said. That’s because currently the programing algorithms used to synchronize the ventricle and atrium become less reliable at heart rates above 105 beats/min, he explained. However, the ability for mechanical synchronization to keep up at higher heart rates should improve as additional data are collected that can refine the algorithms. It’s also unusual for most patients who are pacemaker candidates to reach heart rates this high, he said.
The MARVEL and MARVEL 2 studies were sponsored by Medtronic, the company that markets Micra pacemakers. Dr. Chinitz has received fees and fellowship support from Medtronic; fees from Abbott, Biosense Webster, Biotronik, and Pfizer; and fellowship support from Biotronik and Boston Scientific. Dr. Kowal is a Medtronic employee.
ECHELON-1 update: A+AVD bests ABVD in Hodgkin lymphoma
Brentuximab vedotin plus doxorubicin, vinblastine, and dacarbazine (A+AVD) provides “robust, sustained efficacy” in patients with Hodgkin lymphoma, according to investigators.
In the ECHELON-1 trial, investigators compared A+AVD to doxorubicin, bleomycin, vinblastine, and dacarbazine (ABVD) as frontline treatment for stage III or IV Hodgkin lymphoma. The 3-year progression-free survival (PFS) was superior in patients who received A+AVD, and this benefit was seen across most subgroups.
David J. Straus, MD, of Memorial Sloan Kettering Cancer Center in New York, and his colleagues detailed these findings in Blood.
The phase 3 trial (NCT01712490) enrolled 1,334 patients with stage III or IV classical Hodgkin lymphoma. They were randomized to receive A+AVD (n = 664) or ABVD (n = 670). Baseline characteristics were similar between the treatment arms.
Positron emission tomography status after cycle 2 (PET2) was similar between the treatment arms as well. Most patients – 89% of the A+AVD arm and 86% of the ABVD arm – were PET2 negative. Treating physicians used PET2 status as a guide to potentially switch patients to an alternative regimen (radiotherapy or chemotherapy with or without transplant).
In a prior analysis, the study’s primary endpoint was modified PFS (time to progression, death, or noncomplete response after frontline therapy) per an independent review committee (N Engl J Med. 2018;378:331-44). The 2-year modified PFS rate was 82.1% in the A+AVD arm and 77.2% in the ABVD arm (hazard ratio, 0.77; P = .04).
PFS update
In the current analysis, the main exploratory endpoint was PFS per investigator. The 3-year PFS rate was significantly higher in the A+AVD arm than in the ABVD arm – 83.1% and 76.0%, respectively (HR, 0.704; P = .005).
The investigators observed a “consistent improvement in PFS” in the A+AVD arm, regardless of disease stage, International Prognostic score, Eastern Cooperative Oncology Group status, sex, or age. There was a significant improvement in PFS with A+AVD in PET2-negative patients and a trend toward improvement in PET2-positive patients. In the PET2-negative patients, the 3-year PFS was 85.8% in the A+AVD arm and 79.5% in the ABVD arm (HR, 0.69; P = .009). In PET2-positive patients, the 3-year PFS was 67.7% and 51.5%, respectively (HR, 0.59; P = .077).
“These data highlight that A+AVD provides a durable efficacy benefit, compared with ABVD, for frontline stage III/IV cHL [classical Hodgkin lymphoma], which is consistent across key subgroups regardless of patient status at PET2,” Dr. Straus and his colleagues wrote.
Safety update
In both treatment arms, peripheral neuropathy continued to improve or resolve with longer follow-up. Among patients who developed peripheral neuropathy, 78% in the A+AVD arm and 83% in the ABVD arm had improvement or resolution of the condition at 3 years.
Most patients had complete resolution of peripheral neuropathy: 62% in the A+AVD arm and 73% in the ABVD arm. The median time to complete resolution was 28 weeks (range, 0-167 weeks) after stopping A+AVD and 14 weeks (range, 0-188 weeks) after stopping ABVD.
The incidence of secondary malignancies was similar between the treatment arms. There were 14 secondary malignancies in the A+AVD arm (6 solid tumors, 8 hematologic malignancies) and 20 in the ABVD arm (9 solid tumors, 11 hematologic malignancies).
“A+AVD provided a sustained PFS benefit with a predictable and manageable safety profile,” Dr. Straus and colleagues wrote. “These data further support the advantages of A+AVD versus ABVD as frontline treatment of patients with advanced stage III or IV cHL [classical Hodgkin lymphoma].”
The ECHELON-1 trial was sponsored by Millennium Pharmaceuticals (a subsidiary of Takeda) and Seattle Genetics. The investigators disclosed relationships with Millennium, Takeda, Seattle Genetics, and a range of other companies.
SOURCE: Straus DJ et al. Blood. 2020 Jan 16. pii: blood.2019003127. doi: 10.1182/blood.2019003127.
FROM BLOOD
Most epidermolysis bullosa patients turn to topical antimicrobials
Most patients with epidermolysis bullosa who use topical products choose antimicrobials, according to data from a survey of 202 children and adults.
Management of epidermolysis bullosa (EB) involves a combination of skin protection and infection management, but patient home care practices have not been well studied, wrote Leila Shayegan of Columbia University, New York, and colleagues.
In a study published in Pediatric Dermatology, the researchers surveyed 202 patients who were enrolled in the Epidermolysis Bullosa Clinical Characterization and Outcomes Database during 2017. The patients ranged in age from 1 month to 62 years with an average age of 11 years; 52% were female. The patients represented a range of EB subtypes, including 130 patients with dystrophic EB, 51 patients with EB simplex, 21 with junctional EB, and 3 patients each with Kindler syndrome and unspecified subtypes.
Overall, most of the patients reported cleaning their skin either every day (37%) or every other day (32%). Of the 188 patients who reported using topical products on their wounds, 131 (70%) said they used at least one antimicrobial product, while 125 patients (66%) reported using at least one emollient; 32 (17%) used emollients only, and 21 (11%) reported no use of topical products.
The most popular topical antibiotics were mupirocin (31%) and bacitracin (31%). In addition, 14% of respondents used silver-containing products, and 16% used medical-grade honey. Roughly half (51%) of patients who reported use of at least one antimicrobial product used two or more different antimicrobial products.
A total of 38% of patients used only water for cleansing. Of the 131 patients who reported using additives in their cleansing water, 57% added salt, 54% added bleach, 27% added vinegar, and 26% reported “other” additive use, which could include Epsom salt, baking soda, oatmeal, or essential oils, the researchers said. The concentrations of these additives ranged from barely effective 0.002% sodium hypochlorite and 0.002% acetic acid solutions to potentially cytotoxic solutions of 0.09% sodium hypochlorite and 0.156% acetic acid.
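The reported concentration range spans roughly a 45-fold difference in bleach content, which is easiest to see as a dilution calculation; a minimal sketch, assuming household bleach at about 6% sodium hypochlorite and a 150 L bath (both assumptions, not figures from the article):

```python
# Dilution sketch: volume of stock bleach needed to reach a target NaOCl
# concentration in a bath. Assumptions (not from the article): household
# bleach at ~6% NaOCl, a 150 L (~40 gal) bathtub.
BLEACH_STOCK = 0.06   # 6% NaOCl, typical household bleach
TUB_LITERS = 150

def bleach_volume_ml(target_fraction, tub_l=TUB_LITERS, stock=BLEACH_STOCK):
    """Milliliters of stock bleach for the target final concentration."""
    return target_fraction * tub_l * 1000 / stock

low = bleach_volume_ml(0.00002)   # 0.002% NaOCl, the low end reported
high = bleach_volume_ml(0.0009)   # 0.09% NaOCl, the high (potentially cytotoxic) end
print(f"Low end: {low:.0f} mL of bleach; high end: {high:.0f} mL")
```

Under these assumptions the low end works out to about 50 mL of bleach per tub versus more than 2 L at the high end, which illustrates how easily home preparation can drift from barely effective to potentially cytotoxic concentrations.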
“Although the survey was not designed to correlate skin care practices with wound culture results and resistance patterns, widespread use of topical antimicrobials described among EB patients highlights the need for increased emphasis on antibiotic stewardship,” the researchers noted. They added that health care providers should educate patients and families not only about mindful use of antibiotics, but also appropriate concentrations of cleansing additives.
“Optimizing EB patient home skin care routines, along with future longitudinal studies on the impact of EB skin care interventions on microbial resistance patterns, wound healing and [squamous cell carcinoma] risk are necessary to improve outcomes for patients with EB,” they emphasized.
The Epidermolysis Bullosa Clinical Characterization and Outcomes Database used in the study is funded by the Epidermolysis Bullosa Research Partnership and the Epidermolysis Bullosa Medical Research Foundation. Ms. Shayegan had no financial conflicts to disclose. Several coauthors disclosed relationships with multiple companies including Abeona Therapeutics, Castle Creek Pharmaceuticals, Fibrocell Science, ProQR, and Scioderm.
SOURCE: Shayegan L et al. Pediatr Dermatol. 2020. doi: 10.1111/pde.14102.
FROM PEDIATRIC DERMATOLOGY