14% of ASCVD patients need a PCSK9 inhibitor to reach LDL goal
ROME – An estimated 14% of Americans with atherosclerotic cardiovascular disease can’t reach the LDL cholesterol goal of less than 70 mg/dL on maximal intensified oral lipid-lowering therapy and thus are candidates for a PCSK9 inhibitor such as alirocumab, Christopher P. Cannon, MD, reported at the annual congress of the European Society of Cardiology.
After adding alirocumab (Praluent) at 75 mg by subcutaneous injection every 2 weeks, that figure drops to 2%. And by increasing the alirocumab dose to 150 mg in that 2%, the result is that fewer than 1% of patients with atherosclerotic cardiovascular disease (ASCVD) will have an LDL cholesterol level of 70 mg/dL or more, assuming no tolerability issues along the way, added Dr. Cannon, professor of medicine at Harvard Medical School, Boston.
This Monte Carlo simulation relies on lipid-lowering treatment outcome rates from published landmark clinical trials such as IMPROVE-IT (N Engl J Med. 2015 Jun 18;372[25]:2387-97), for which Dr. Cannon was a lead investigator, as well as data from the ongoing ODYSSEY program of alirocumab studies. Importantly, the model doesn’t factor in drug intolerance.
In this model, the average age of the hypothetical 1 million ASCVD patients was 66.5 years and 54.6% were men. The distribution of ASCVD diagnoses was representative of the real-world experience: 70% had coronary heart disease, 25% had ischemic cerebrovascular disease, 35% had peripheral artery disease, and 5% had experienced an acute coronary syndrome within the past 12 months.
Current guidelines strongly recommend that all of these patients be on lipid-lowering therapy, yet only 53% were at baseline. A guideline-recommended first step would be to start those not yet on a lipid-lowering drug on atorvastatin at 20 mg/day; after that step, 50% of the 1 million ASCVD patients would be at the goal of an LDL cholesterol level below 70 mg/dL.
For the other 50%, a reasonable next step would be a high-intensity statin: say, atorvastatin at 80 mg/day instead of 20. That would leave only 21% of the original ASCVD population with an LDL cholesterol level of 70 mg/dL or higher. The next step for those patients, as established in IMPROVE-IT, would be to add ezetimibe (Zetia). That constitutes maximal oral lipid-lowering therapy, and 14% of the original ASCVD population would still have an LDL cholesterol level of 70 mg/dL or more on that multidrug regimen.
On the basis of the results of the ODYSSEY trials, adding alirocumab at 75 mg would drop that figure from 14% down to 2%. And by switching to alirocumab at 150 mg every 2 weeks in those outliers, less than 1% of the 1 million patients with ASCVD would still have an LDL cholesterol level of 70 mg/dL or more. The mean LDL cholesterol level would be 52.0 mg/dL in patients on full treatment intensification with a high-dose statin, ezetimibe, and alirocumab.
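For readers who want to trace the arithmetic of this cascade, the reported step-level figures can be reproduced with a simple simulation. The sketch below (Python) is a hypothetical simplification: the actual model sampled individual patient LDL values and applied trial-derived percent reductions, whereas this version merely back-calculates per-step response probabilities from the published percentages, with an assumed 0.7% standing in for the final "fewer than 1%" figure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # size of the hypothetical ASCVD cohort in the model

# Fraction of the full cohort still at or above 70 mg/dL after each
# escalation step, as reported; the last value is an assumption for
# the "fewer than 1%" figure.
remaining_after = [
    ("atorvastatin 20 mg/day",   0.50),
    ("atorvastatin 80 mg/day",   0.21),
    ("+ ezetimibe",              0.14),
    ("+ alirocumab 75 mg q2wk",  0.02),
    ("+ alirocumab 150 mg q2wk", 0.007),
]

above_goal = np.ones(n, dtype=bool)  # start everyone above goal for simplicity
prev_frac = 1.0
for step, frac in remaining_after:
    p_reach_goal = 1.0 - frac / prev_frac      # conditional response rate at this step
    responders = rng.random(n) < p_reach_goal  # only matters for those still above goal
    above_goal &= ~responders
    prev_frac = frac
    print(f"{step:26s} {above_goal.mean():6.1%} still >= 70 mg/dL")
```

Run as written, the printout tracks the reported cascade: roughly 50%, 21%, 14%, 2%, and finally under 1% of the cohort remaining above goal.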
If future studies were to establish a new LDL cholesterol goal of less than 55 mg/dL for patients with known ASCVD, the simulation indicates that just under 59% of patients on full treatment intensification including alirocumab would achieve it, according to Dr. Cannon.
He reported receiving research grants from and/or serving as a consultant to well over a dozen pharmaceutical companies, including Sanofi and Regeneron, which sponsored this analysis.
AT THE ESC CONGRESS 2016
Key clinical point: An estimated 14% of Americans with atherosclerotic cardiovascular disease cannot reach an LDL cholesterol level below 70 mg/dL on maximal oral lipid-lowering therapy and are therefore candidates for a PCSK9 inhibitor.
Major finding: The combination of a high-intensity statin, ezetimibe, and alirocumab should enable more than 99% of Americans with atherosclerotic cardiovascular disease to achieve an LDL cholesterol level below 70 mg/dL.
Data source: This Monte Carlo simulation model created a hypothetical cohort of 1 million Americans with ASCVD and utilized outcome data from landmark clinical trials to estimate the patients’ ability to achieve a guideline-recommended LDL cholesterol level below 70 mg/dL in response to various intensities of lipid-lowering therapy.
Disclosures: This analysis was funded by Sanofi and Regeneron. The presenter reported receiving research grants from and serving as a consultant to those pharmaceutical companies and more than a dozen others.
PsA bone loss measurement: A surrogate for radiographic progression?
An advanced computer-assisted digital x-ray radiogrammetry technique that measures bone thickness has the potential to serve as a surrogate marker of radiographic progression in psoriatic arthritis, according to a report in Arthritis Research & Therapy.
The method uses software called BoneXpert to sensitively differentiate between the different stages of disease manifestation affecting bone integrity. Digital x-ray radiogrammetry (DXR) with BoneXpert has a clinical advantage over standard techniques such as radiographs because it can be integrated into a picture archiving and communication system, which allows direct image analysis and quantification of bone loss, according to the study authors, led by Alexander Pfeil, MD, of Jena (Germany) University Hospital – Friedrich Schiller University.
The researchers used the computer-assisted diagnosis software to measure the metacarpal index (MCI) and its cortical thickness score (MCI T-score) in the metacarpal bones of 104 psoriatic arthritis (PsA) patients who fulfilled the CASPAR criteria. All patients were treated either with nonsteroidal anti-inflammatory drugs or disease-modifying antirheumatic drugs (Arthritis Res Ther. 2016;18:248. doi: 10.1186/s13075-016-1145-4).
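The MCI T-score expresses an individual's metacarpal index relative to a healthy reference population. The study's reference values are not given here, so the snippet below is only a generic illustration of the standard T-score calculation, with hypothetical reference numbers.

```python
def t_score(measured_mci: float, reference_mean: float, reference_sd: float) -> float:
    """Number of reference-population standard deviations the measured
    metacarpal index lies from the healthy reference mean."""
    return (measured_mci - reference_mean) / reference_sd

# Hypothetical reference values for illustration only: a measured MCI below
# the reference mean yields a negative T-score, as in the cohort mean of -1.289
# reported below.
print(t_score(measured_mci=0.38, reference_mean=0.45, reference_sd=0.054))  # ~ -1.3
```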
In the total PsA cohort, the mean MCI T-score was significantly reduced, at –1.289. “The reduced MCI T-score was clearly associated with a reduced bone mineral density of the metacarpal bones in PsA,” the investigators wrote.
For all scores, the researchers found a severity-dependent reduction for the BoneXpert parameters of MCI, MCI T-score, T, and Bone Health Index.
The strongest reductions were seen for MCI and T using the Proliferation Score (MCI: –28.3%; T: –31.9%) and the Destruction Score (MCI: –30.8%; T: –30.9%) of the Psoriatic Arthritis Ratingen Score.
Reduced MCI and T-scores were directly associated with cortical thinning and periarticular demineralization of the metacarpal bones, highlighting a direct association with bone destruction and bone proliferation in PsA, the investigators said.
“The measurement of periarticular bone loss can be considered a complementary approach to verify PsA-related bony changes and a surrogate marker for PsA progression,” the researchers suggested.
The technique’s high reproducibility could also help optimize an appropriate individual therapeutic strategy, they added.
The study had no specific funding source, and the authors declared no conflicts of interest.
FROM ARTHRITIS RESEARCH & THERAPY
Key clinical point: Digital x-ray radiogrammetry measurement of periarticular bone loss may serve as a surrogate marker of radiographic progression in psoriatic arthritis.
Major finding: In the total PsA cohort, the mean MCI T-score was significantly reduced, at –1.289.
Data source: A cohort of 104 PsA patients fulfilling the CASPAR criteria who were taking nonsteroidal anti-inflammatory drugs or disease-modifying antirheumatic drugs.
Disclosures: The study had no specific funding source, and the authors declared no conflicts of interest.
Alemtuzumab Reduces Preexisting MS Disability
In addition to slowing disability accumulation, alemtuzumab improves preexisting disability in patients with relapsing-remitting multiple sclerosis (MS) who have had inadequate responses to prior therapies, according to research published online ahead of print October 12 in Neurology. “Disabilities may often be reversible (at least partially) in patients with active relapsing-remitting MS if they receive suitable therapy, irrespective of the type of baseline functional deficit,” said Gavin Giovannoni, MD, PhD, Professor of Neurology at Barts and the London School of Medicine and Dentistry, and colleagues.
Most currently approved therapies for relapsing-remitting MS delay confirmed disability worsening, compared with placebo. The introduction of more potent drugs in recent years, however, has made the goal of confirmed disability improvement (CDI) appear more feasible. In the CARE-MS II (Comparison of Alemtuzumab and Rebif Efficacy in MS II) trial, CDI was more likely among patients receiving alemtuzumab than among those receiving interferon beta-1a.
An Analysis of CARE-MS II Data
Dr. Giovannoni and colleagues examined prespecified and post hoc disability outcomes of CARE-MS II to characterize alemtuzumab’s effect on preexisting disability. In the trial, patients with relapsing-remitting MS with an inadequate response to prior disease-modifying therapies were randomized to alemtuzumab or subcutaneous interferon beta-1a. Patients randomized to alemtuzumab received 12 mg/day of the treatment on five consecutive days at month 0, and on three consecutive days at month 12. Participants randomized to interferon received 44 mcg three times weekly. The study lasted for two years.
Blinded raters performed Expanded Disability Status Scale (EDSS) assessments at baseline, every three months, and when relapse was suspected. They administered the MS Functional Composite (MSFC) three times before baseline to attenuate practice effects, and then every six months. Finally, they assessed visual function every six months with the binocular Sloan low-contrast letter acuity (SLCLA) test.
Dr. Giovannoni and colleagues assessed four tertiary end points of CARE-MS II. The first was time to CDI, defined as a decrease of one or more points in EDSS from baseline sustained for at least three or at least six months in patients with a baseline score of 2 or greater. The second was the proportion of patients worsened (increase of 0.5 or more points), stable, or improved (decrease of 0.5 or more points) from baseline EDSS. The third was mean change from baseline in MSFC and MSFC plus SLCLA scores and their components. The fourth was the proportions worsened (decrease of 0.5 or more standard deviations), stable, or improved (increase of 0.5 or more standard deviations) from baseline MSFC scores.
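To make these endpoint definitions concrete, the sketch below (Python) classifies a patient's 24-month EDSS change and checks for confirmed disability improvement on 3-monthly assessments. It is an illustrative simplification; the trial's exact confirmation rules and handling of missed visits are not described here, so the reading of "sustained" used in the code is an assumption.

```python
from typing import List

def edss_category(baseline: float, month_24: float) -> str:
    """Categorical EDSS endpoint: improved/worsened means a change of >= 0.5 points."""
    delta = month_24 - baseline
    if delta <= -0.5:
        return "improved"
    if delta >= 0.5:
        return "worsened"
    return "stable"

def has_cdi(baseline: float, quarterly_edss: List[float], months_sustained: int = 6) -> bool:
    """Confirmed disability improvement: a decrease of >= 1.0 EDSS point from
    baseline, sustained over the required window, in patients with a baseline
    EDSS of 2.0 or more. Assessments are every 3 months; one plausible reading
    of "sustained for 6 months" is 3 consecutive qualifying visits (2 visits
    for the 3-month variant)."""
    if baseline < 2.0:
        return False
    needed = months_sustained // 3 + 1
    run = 0
    for score in quarterly_edss:
        run = run + 1 if baseline - score >= 1.0 else 0
        if run >= needed:
            return True
    return False

# Example: a baseline EDSS of 3.0 followed by scores of 2.0 at three
# consecutive quarterly visits meets the six-month CDI definition here.
```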
Results Consistently Favored Alemtuzumab
In all, 202 patients were randomized to interferon beta-1a, and 426 patients were randomized to alemtuzumab. Baseline demographic and clinical characteristics were similar between treatment groups. The groups had comparable percentages of patients with recent prestudy relapse.
At month 24, EDSS improvement, as well as improvement in all seven EDSS functional systems, was more common among patients receiving alemtuzumab, compared with those receiving interferon. Participants receiving alemtuzumab were more than twice as likely as those receiving interferon to have three-month CDI. Among patients with a baseline EDSS of 3 or higher, the proportion of patients with six-month CDI was also significantly greater with alemtuzumab than with interferon. Stratification of results by presence or absence of prior interferon use did not affect the results, nor did stratification by presence or absence of relapse within three months before initiating treatment.
In addition, the likelihood of six-month CDI in MSFC score from baseline to month 24 was greater for patients receiving alemtuzumab than those receiving interferon. Participants in the interferon group were significantly more likely than those in the alemtuzumab group to have 15% or greater worsening in MSFC sustained for six months. The difference between treatment groups in 20% or greater worsening in MSFC sustained for six months was not statistically significant.
At months 12 and 24, visual acuity in patients receiving alemtuzumab was stable at 2.5% contrast and at 100% contrast; the changes from baseline were not statistically significant. Participants receiving interferon had a significant decline in visual acuity from baseline to month 12 and from baseline to month 24 at 1.25% contrast and 2.5% contrast. Visual acuity declined significantly from baseline to month 24 in the interferon group at 100% contrast. Differences between treatment groups were significant at 2.5% contrast at months 12 and 24, and at 1.25% contrast at month 12.
Structural or Functional Repair?
“Giovannoni et al demonstrated that comparing two drugs for their efficacy on disability progression omits a crucial aspect of the MS disease process: sustained reduction in disability,” said Bibiana Bielekova, MD, investigator at the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland, and Mar Tintoré, MD, PhD, neurologist at the MS Centre of Catalonia in Barcelona, in an accompanying editorial. “Comparing both sides of the disability changes [ie, disability progression and disability reduction] between the two drugs doubles the amount of clinically useful information.”
The CARE-MS II design, however, may artificially overestimate the benefit of alemtuzumab over interferon, they added. More than 50% of enrolled patients were previously treated with interferon beta-1a, and the inclusion criteria required the presence of relapses while on this therapy. These factors “technically excluded patients who had optimal therapeutic response to interferon beta-1a,” said Drs. Bielekova and Tintoré. “Nevertheless, a similar observation was seen in treatment-naive patients with relapsing-remitting MS in the CAMMS223 phase II trial.”
Dr. Giovannoni’s group ruled out, to an extent, the possibility that disability improvements resulted solely from the reversal of exacerbation-related disability. Similarly, the observed sustained reduction in disability likely did not simply reflect measurement variance, because the results on various outcomes consistently favored alemtuzumab. “One can only speculate whether the sustained reduction in disability is due to structural repair (ie, remyelination) or functional repair (ie, plasticity, such as formation of new synapses). We favor the latter idea, based on the early experience with CD52-depleting antibody,” said Drs. Bielekova and Tintoré.
“Despite unarguable progress in MS therapeutics, there is still a long road ahead until we can eliminate disease progression for all patients,” they concluded.
—Erik Greb
Suggested Reading
Giovannoni G, Cohen JA, Coles AJ, et al. Alemtuzumab improves preexisting disability in active relapsing-remitting MS patients. Neurology. 2016 Oct 12 [Epub ahead of print].
Bielekova B, Tintore M. Sustained reduction of MS disability: New player in comparing disease-modifying treatments. Neurology. 2016 Oct 12 [Epub ahead of print].
Nonsteroidal Anti-inflammatory Drugs and Cardiovascular Risk: Where Are We Today?
- Historical Overview
- Mechanistic Basis for a Cardiovascular Hazard
- Evidence from Meta-Analyses
- Cardiovascular Risk
- Implications for Patient Management
Faculty/Faculty Disclosure:
Gary Rouff, MD
Clinical Professor of Family Medicine, Department of Family Practice, Michigan State University College of Medicine
Director of Clinical Research, Westside Family Medical Center, Kalamazoo, MI
Dr. Rouff discloses that he has no real or apparent conflict of interest to report.
Using CHIMPS for type A dissection in a high-risk patient
Traditional open repair for type A aortic dissection in patients with Marfan syndrome and a previous cardiovascular surgery carries a high risk of morbidity and mortality, but a team of surgeons from China has reported on a hybrid technique that combines open and endovascular approaches to repair type A dissection in a patient with Marfan syndrome.
In the October issue of the Journal of Thoracic and Cardiovascular Surgery (2016;152:1191-3), Hong-wei Zhang, MD, and colleagues from West China Hospital of Sichuan University explained their technique of using chimney and sandwich grafts to repair a type A dissection in a patient late after Bentall surgery. “With great advancements in recent thoracic endovascular aortic repair technology, innovative hybrid operations combining open and endovascular techniques hold promising potential to expand treatment options,” Dr. Zhang and coauthors said.
They reported on a 33-year-old male with Marfan syndrome (MFS) who had elective aortic root and mechanical valve replacement 10 years earlier. Three days of persistent chest and back pain caused the patient to go to the emergency department, where computed tomography angiography (CTA) confirmed a type A aortic dissection from the distal ascending aorta to the iliac arteries and involving the proximal innominate artery and left common carotid artery (LCCA).
Because the patient refused another open surgery, Dr. Zhang and colleagues executed their hybrid approach, the first step of which was to create an LCCA–left axillary artery bypass with a 6-mm Gore-Tex graft (W.L. Gore & Associates). After they led the graft through the costoclavicular passage, they introduced the first (distal) thoracic stent graft (Valiant Captivia, Medtronic) from the right femoral artery and deployed it at the proximal descending aorta. They then inserted the second (proximal) thoracic stent graft into the previous ascending synthetic graft.
Next, they delivered the chimney grafts, two Fluency Plus covered stents (Bard Peripheral Vascular), from the right brachial and innominate artery into the ascending graft. Then they delivered two more Fluency grafts from the LCCA into the endolumen of the first (distal) thoracic stent graft.
After they deployed the second (proximal) thoracic stent graft, they deployed the precisely positioned stent grafts from the innominate artery and the LCCA, sandwiching the covered stents for the LCCA between the two thoracic stent grafts. They then occluded the left subclavian artery with a 10-mm double-disk vascular occluder.
Upon angiography at completion, Dr. Zhang and coauthors found an endoleak from the overlap zones between the two thoracic stent grafts.
However, the patient’s postoperative course was uneventful, and CTA 5 days after surgery showed complete sealing of the primary entry tear with patent chimney and sandwich grafts. The patient remained symptom-free at 30 days, when CTA again confirmed patency of the supra-arch grafts.
Dr. Zhang and coauthors acknowledged that a carotid-to-carotid bypass could have been an alternative that would use fewer stent grafts and reduce the risk of endoleaks in this case, but they opted for their approach because of the dissection of the proximal innominate artery and LCCA and their concern about the long-term patency of a carotid-to-carotid bypass. “To our knowledge, this is the first reported case of a hybrid treatment for new-onset, type A aortic dissection in patients with MFS with a previous Bentall procedure,” Dr. Zhang and coauthors said. “Although further staged repairs are required in our case, this endovascular technique could be an effective and life-saving treatment option for the high-risk repeated surgical patients with MFS.”
Dr. Zhang and coauthors had no financial relationships to disclose.
In their invited commentary, Lars Svensson, MD, PhD, Matthew Eagleton, MD, and Eric Roselli, MD, of the Cleveland Clinic, said the approach Dr. Zhang and colleagues reported on is one of the “novel” endovascular CHIMPS methods for aortic arch repair – CHIMPS meaning chimneys, periscopes, snorkels, and sandwiches (J Thorac Cardiovasc Surg. 2016;152:958-9). But they noted that one of the ongoing challenges with these types of parallel grafts is the gutter leaks that occur between the sandwich grafts.
The commentators noted that CHIMPS procedures are easier alternatives to using spiral branch graft stents for the thoracoabdominal aorta or direct-connecting branch stems from an aortic stent in the arch, but they added, “An important caveat is that the blood supply maintenance and long-term durability may not be adequate.”
The patient Dr. Zhang and colleagues reported on “is young and will need a durable operation,” Dr. Svensson, Dr. Eagleton, and Dr. Roselli said. “Unfortunately, in our experience over time we have observed that these CHIMPS procedures tend to break down and leak into the arch, including the arch actually rupturing,” they said. These patients will need “intensive” monitoring. What’s more, patients with Marfan syndrome are prone to aneurysm formation “and are not good candidates for stenting,” the commentators said.
“Nevertheless, further engineering iterations of CHIMPS may address the problem with gutter leaks and become an alternative to the elephant trunk procedure for those patients who are at particularly high risk,” the commentators said.
Dr. Svensson disclosed he holds a patent with potential royalties for an aortic valve and aortic root stent graft with connecting branch grafts to the coronary ostia. Dr. Roselli is a consultant and investigator for Bolton, Gore, and Medtronic. Dr. Eagleton has no relationships to disclose.
FROM THE JOURNAL OF THORACIC AND CARDIOVASCULAR SURGERY
Key clinical point: Chimney and sandwich grafts facilitate hybrid repair of type A aortic dissection for a Marfan syndrome patient after Bentall surgery.
Major finding: A 33-year-old male with Marfan syndrome and a history of cardiac surgery was asymptomatic 30 days after hybrid repair for type A aortic dissection.
Data source: Case report of a single patient at an academic medical center.
Disclosures: Dr. Zhang and coauthors reported having no financial disclosures.
Newborns with CHD have reduced cerebral oxygen delivery
Using a newer form of MRI to investigate oxygen levels in newborns with congenital heart disease, researchers in Canada reported that these patients may have impaired brain growth and development in the first weeks of life because of significantly lower cerebral oxygen delivery levels.
These findings suggest that oxygen delivery may impact brain growth, particularly in newborns with single-ventricle physiology, reported Jessie Mei Lim, BSc, of the University of Toronto, and her colleagues from McGill University, Montreal, and the Hospital for Sick Children, Toronto. The findings were published in the October issue of the Journal of Thoracic and Cardiovascular Surgery (2016;152:1095-103). Ms. Lim and her colleagues used cine phase-contrast (PC) MRI to measure cerebral blood flow (CBF) in newborns with congenital heart disease (CHD). Previous studies using optical measures of tissue oxygenation and MRI arterial spin labeling had suggested that newborns with severe CHD have impaired CBF and cerebral oxygen delivery (CDO2).
This single-center study involved 63 newborns from June 2013 to April 2015 at the Hospital for Sick Children. These subjects received an MRI of the head before surgery at an average age of 7.5 days. The scans were done without sedation or contrast while the infants were asleep. The study compared 31 age-matched controls with 32 subjects with various forms of CHD – 12 were managed surgically along a single-ventricle pathway (SVP), 4 had coarctation of the aorta, 13 had transposition of the great arteries (TGA), and 3 had other forms of CHD.
The researchers validated their method by reporting similarities between flows in the basilar and vertebral arteries in 14 controls, “suggesting good consistency and accuracy of our method for measuring CBF,” Ms. Lim and her coauthors noted. A comparison of CBF measured with an unpaired Student t test revealed no significant differences between the CHD group and controls. The average net CBF in CHD patients was 103.5 mL/min vs. 119.7 mL/min in controls.
However, when evaluating CDO2 using a Student t test, the researchers found significantly lower levels in the CHD group – an average of 1,881 mL O2/min vs. 2,712 mL O2/min in controls (P less than .0001). And when the researchers indexed CDO2 to brain volume, yielding indexed oxygen delivery, the difference between the two groups remained significant: an average of 523.1 mL O2/min per 100 g in the CHD group and 685.6 mL O2/min per 100 g in controls (P = .0006).
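Indexing here presumably means dividing each newborn’s oxygen delivery by that newborn’s brain volume and expressing the result per 100 g (roughly 100 mL) of brain tissue – a sketch of the relation as we read it, not the authors’ stated formula; group means will not reconcile exactly because the indexing would have been done per subject:

\[ \text{indexed CDO}_2 \approx \frac{\text{CDO}_2}{\text{brain volume}} \times 100 \]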
Among the CHD group, those with SVP and TGA had significantly lower CDO2 than that of controls. Brain volumes were also lower in those with CHD (mean of 338.5 mL vs. 377.7 mL in controls, P = .002).
The MRI findings were telling in the study population, Ms. Lim and her coauthors said. Five subjects in the CHD group had a combination of diffuse excessive high-signal intensity (DEHSI) and white-matter injury (WMI), 10 had an isolated finding of DEHSI, two had WMI alone and five others had other minor brain abnormalities. But the control group had no abnormal findings on conventional brain MRI.
The researchers acknowledged that, while the impact of reduced cerebral oxygen delivery is unknown, “theoretical reasons for thinking it might adversely impact ongoing brain growth and development during this period of rapid brain growth are considered.”
Cardiovascular surgeons should consider these findings when deciding on when to operate on newborns with CHD, the researchers said. “Further support for the concept that such a mechanism could lead to irreversible deficits in brain growth and development might result in attempts to expedite surgical repair of congenital cardiac lesions, which have conventionally not been addressed in the neonatal period,” they wrote.
Ms. Lim and her coauthors had no financial relationships to disclose.
Congenital heart disease (CHD) is heterogeneous, and different types of lesions may cause different hemodynamics, Caitlin K. Rollins, MD, of Boston Children’s Hospital and Harvard Medical School, said in her invited commentary (J Thorac Cardiovasc Surg. 2016;152:960-1).
Ms. Lim and her colleagues in this study confirmed that premise with their finding that newborns with CHD and controls had similar cerebral blood flow, but that those with CHD had reduced oxygen delivery. “These differences were most apparent in the neonates with single-ventricle physiology and transposition of the great arteries,” Dr. Rollins said. The study authors’ finding of an association between reduced oxygen delivery and impaired brain development, along with this group’s previous reports (Circulation 2015;131:1313-23) suggesting preserved cerebral blood flow in the late prenatal period, differ from other studies using traditional methods to show reduced cerebral blood flow in obstructive left-sided lesions, Dr. Rollins said. “Although technical differences may in part account for the discrepancy, the contrasting results also reflect that the relative contributions of abnormal cerebral blood flow and oxygenation differ among forms of CHD,” Dr. Rollins said.
FROM THE JOURNAL OF THORACIC AND CARDIOVASCULAR SURGERY
Key clinical point: Cerebral blood flow is maintained but cerebral oxygen delivery is decreased in preoperative newborns with cyanotic congenital heart disease (CHD).
Major finding: Average cerebral oxygen delivery, measured with Student t testing, was 1,881 mL O2/min in the CHD group vs. 2,712 mL O2/min in controls (P less than .0001).
Data source: Single-center study of 32 neonates with various forms of CHD and 31 age-matched controls.
Disclosures: Ms. Lim and coauthors had no financial relationships to disclose.
Time to rethink bioprosthetic valve guidelines?
Recent findings on the incidence and pathophysiology of bioprosthetic valve thrombosis call for revisiting existing guidelines that advise against routine echocardiography in the first 5 years after bioprosthetic valve replacement, and for a longer course of anticoagulation therapy than the current standard of 3 months, investigators from the Mayo Clinic said in an expert opinion article in the October issue of the Journal of Thoracic and Cardiovascular Surgery (2016;152:975-8).
In the expert commentary, Alexander C. Egbe, MBBS, of the departments of cardiovascular diseases and cardiovascular surgery at Mayo Clinic in Rochester, Minn., and coauthors explored the implications of their previous research, published in the Journal of the American College of Cardiology, which reported that bioprosthetic valve thrombosis (BPVT) is “not an uncommon cause of prosthetic valve dysfunction.” They identified BPVT in 46 of 397 (11%) bioprosthetic valves explanted at Mayo Clinic and estimated the incidence of BPVT at 1% (J Am Coll Cardiol. 2015;66:2285-94), although Dr. Egbe and colleagues acknowledged that the true incidence of BPVT is unknown, as is the time to occurrence; a different study design would be needed to determine both.
“The occurrence of BPVT is not restricted to surgically implanted bioprosthetic valves, but has also been observed after transcatheter aortic valve replacement (TAVR),” Dr. Egbe and colleagues said. They noted an association between BPVT and a lack of anticoagulation therapy in two earlier reports (N Engl J Med. 2015;373:2015-24; J Am Coll Cardiol. 2016;67:644-55). In their own study, 14 of 15 patients (93%) with diagnosed BPVT responded to anticoagulation therapy and avoided reoperation.
Dr. Egbe and coauthors did, however, give some sense of the extent to which BPVT is misdiagnosed. The diagnosis was considered in only 6 of 45 patients (13%) who had transesophageal echocardiography. “A significant proportion of the patients with BPVT were misdiagnosed as having structural failure and referred for reoperation,” Dr. Egbe and coauthors said. “This attests to the low level of awareness of the existence of BPVT and the lack of well-defined diagnostic criteria.”
They proposed a diagnostic model based on three echocardiographic findings: a 50% increase in gradient within 5 years of implantation, increased cusp thickness, and abnormal cusp mobility. “The presence of all three echocardiographic features reliably diagnosed BPVT with a sensitivity of 72% and a specificity of 90%,” they said.
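Expressed as a simple decision rule, the model flags BPVT only when all three features coincide – a paraphrase of the criteria as reported, not the authors’ formal definition:

\[ \text{suspect BPVT} \iff (\geq 50\%\ \text{gradient increase within 5 y}) \land (\text{increased cusp thickness}) \land (\text{abnormal cusp mobility}) \]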
Their finding that 85% of BPVT cases occurred within 5 years of implantation flies in the face of clinical guidelines stating that routine annual echocardiography is not recommended in that time frame (J Am Soc Echocardiogr. 2009;22:975-1014). But requiring abnormal physical examination findings as a prerequisite for echocardiography may not be an effective way to diagnose BPVT. “In addition to transthoracic and transesophageal echocardiography, the use of other complementary imaging modalities, such as computed tomography, could be very effective in identifying subtle BPVT,” Dr. Egbe and colleagues said.
But preventing BPVT is more complicated. Clinical guidelines recommend anticoagulation of bioprosthetic valves for 3 months after implantation, but adhering to that guideline showed no protective effect against BPVT in their study, Dr. Egbe and coauthors said. Nor did antiplatelet therapy prove effective in preventing BPVT. However, a Danish study showed stopping anticoagulation within 6 months of surgical aortic valve replacement increased risk of thromboembolic complications and cardiovascular death (JAMA. 2012;308:2118-25). And the role of prosthesis type in BPVT “remains unclear.”
Dr. Egbe and coauthors acknowledged a number of questions persist with regard to BPVT in bioprosthetic valve dysfunction, including the true incidence, best screening method, risk factors, and the duration of anticoagulation, as well as the role of novel oral anticoagulants. “Answers to these questions will come from population-based prospective studies,” Dr. Egbe and colleagues said.
Dr. Egbe and his coauthors had no relationships to disclose.
Dr. Egbe and colleagues make a “provocative” case that it is the presence of thrombus on bioprosthetic valves, and not degeneration, that causes valve dysfunction, Clifford W. Barlow, MBBCh, DPhil, FRCS, of University Hospital Southampton (England) said in his invited commentary (J Thorac Cardiovasc Surg. 2016;152:978-80).
“This Expert Opinion is of particular interest because it relates to something commonly performed: conventional valve replacement,” Dr. Barlow said. Moreover, “BPVT is an under-recognized problem for which Dr. Egbe and colleagues concisely direct how future research should ascertain which diagnostic, preventive, and treatment strategies would improve long-term outcomes and avoid redo surgery.”
The recommendation by Dr. Egbe and colleagues of prolonged anticoagulation after bioprosthetic valve implantation complicates the selection of bioprosthetic valves, because cardiovascular surgeons frequently choose them to avoid anticoagulation while accepting a higher risk of reoperation because of valve degeneration, Dr. Barlow said.
And while Dr. Barlow noted this study found that porcine valves are not a predictor for BPVT, another Mayo Clinic study reported eight cases of BPVT, all in porcine valves (J Thorac Cardiovasc Surg. 2012;144:108-11). Nonetheless, the expert opinion by Dr. Egbe and colleagues is “relevant to much that is important – not only to improving outcomes with conventional valve replacement but also to these developing technologies,” Dr. Barlow said.
FROM THE JOURNAL OF THORACIC AND CARDIOVASCULAR SURGERY
Key clinical point: Preoperative echocardiography can aid in the diagnosis of BPVT.
Major finding: Sixty-five percent of all reoperations for BPVT occurred more than a year after implantation and up to 15% of these reoperations occurred more than 5 years after the initial implantation.
Data source: Single-center retrospective study of 397 valve explants.
Disclosures: Dr. Egbe and his coauthors reported having no financial disclosures.
Disease severity, QOL outcome measures need standardizing in atopic dermatitis research
The number of outcome measures used to assess disease severity and quality of life (QOL) in randomized controlled trials of patients with atopic dermatitis (AD) has risen in recent years, according to a systematic review.
An overall lack of standardization of these outcome measures, however, is hindering the synthesis and translation of research into clinical practice, reported Mary K. Hill of the University of Colorado, Aurora, and her associates.
“Standardization of disease severity and QOL outcome instruments is essential for comparability among studies and improved quality of research evidence,” they wrote (J Am Acad Dermatol. 2016 Nov;75[5]:906-17. doi: 10.1016/j.jaad.2016.07.002).
Their systematic review of 135 randomized controlled trials (RCTs) identified 62 disease-severity and 28 quality-of-life instruments used in studies of patients with AD between July 2010 and July 2015.
This was a drastic increase from the 20 disease severity scales and 14 QOL indices identified in a previous systematic review of 382 RCTs of AD therapies conducted between 1985 and 2010, they noted.
In their review, the most frequently used disease severity scale was the Scoring Atopic Dermatitis (SCORAD) index, which was used in 79 studies. That was followed by the visual analogue scale (VAS) for pruritus, used in 30 studies; the Investigator’s Global Assessment (IGA) tool, used in 29 studies; and the Eczema Area and Severity Index (EASI), used in 28 studies.
But despite the well-documented burden of AD, the researchers noted that only 33% of the RCTs they reviewed assessed QOL. “This is up from the 18% of RCTs on AD that reported QOL outcomes between 1985 and July 2010, perhaps signifying gradually increased attention to patient emotional well-being,” Ms. Hill and her associates wrote.
However, a trend the authors described as “perhaps the most disconcerting” was that 75% of identified QOL instruments were used only once. “Continued increases in the reporting of QOL outcomes will be of limited benefit for interstudy comparisons if the diversity of measures used also continues to rise,” they said.
Adding to the confusion, the researchers found frequent overlap in the naming and content of instruments used in studies, making it even more challenging to identify meaningful comparisons.
There was no funding source, and the authors had no conflicts of interest to declare.
FROM THE JOURNAL OF THE AMERICAN ACADEMY OF DERMATOLOGY
Key clinical point: Disease severity and quality-of-life measures used in randomized controlled trials (RCTs) of atopic dermatitis (AD) need to be standardized so that study outcomes can be meaningfully compared and translated into clinical practice.
Major finding: The number of outcome measures used in atopic dermatitis RCTs has increased recently, but the trials still fall short in measuring quality-of-life outcomes.
Data source: A systematic review of 135 RCTs, published between July 2010 and July 2015, that involved patients with AD.
Disclosures: There was no funding source, and the authors had no conflicts of interest to declare.
Vitamin D as an Add-On Therapy May Improve MRI Outcomes in MS
LONDON—High-dose vitamin D as an add-on therapy may improve MRI outcomes in patients with relapsing-remitting multiple sclerosis (MS), according to a study presented at the 32nd Congress of the European Committee for Treatment and Research in MS.
“Vitamin D is the precursor of a potent immunoregulatory molecule. However, whether supplementation of it improves outcomes is uncertain, since existing medical evidence is contradictory and involves small patient numbers,” said Raymond Hupperts, MD, PhD, Professor of Neurology at Maastricht University in the Netherlands.
Previous studies have found an association between low serum levels of vitamin D and a greater risk of developing MS and poor MS outcomes. As a result, Dr. Hupperts and colleagues conducted the SOLAR (Supplementation of VigantOL Oil Versus Placebo as Add-on in Patients With Relapsing Remitting MS Receiving Rebif Treatment) study to investigate the effects of vitamin D as add-on therapy in patients receiving subcutaneous interferon beta-1a.
The SOLAR study included 229 patients who were stratified by serum vitamin D level and randomized to vitamin D or placebo. Patients with serum vitamin D levels less than 150 nmol/L were randomized to one of two treatment groups. In treatment group one, patients were given 6,670 IU/day of vitamin D for four weeks, followed by 14,007 IU/day for 44 or 92 weeks, as an add-on to interferon beta-1a. In treatment group two, patients received matching placebo daily as an add-on therapy to interferon beta-1a.
The primary end point for the study was the proportion of patients with no evidence of disease activity (NEDA), which was defined as no relapses, no progression in Expanded Disability Status Scale (EDSS) score, and no gadolinium-enhancing T1 lesions or new or enlarging T2 MRI lesions, at Week 48. The secondary end points included annualized relapse rate (ARR) at Week 48, EDSS progression at Week 48, time to confirmed EDSS score progression, number of combined unique active (CUA) lesions per patient per scan at Week 48, number of T1-hypointense lesions at Week 48, and change from baseline in the total volume of T2 lesions at Week 48.
The ARR at Week 48 was lower in the vitamin D group – 0.28 versus 0.41 in the placebo group, a difference of roughly 30% – but the difference between groups was not statistically significant. In addition, vitamin D was associated with significantly better MRI outcomes. The number of CUA lesions at Week 48 was 1.09 in the vitamin D group and 1.49 in the placebo group. The change from baseline in the total volume of T2 lesions was 3.57% in the vitamin D group and 6.07% in the placebo group. Eighty-five percent of participants between ages 18 and 30 in the vitamin D group and 46.8% of this age group in the placebo arm had no new T1-hypointense lesions at Week 48.
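The roughly 30% figure follows from the two relapse rates read as a relative reduction; a quick back-of-envelope check (editorial arithmetic, not restated from the study report):

\[ \frac{0.41 - 0.28}{0.41} \approx 0.32 \]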
“We conclude that SOLAR did not show a significant effect [of vitamin D] on the primary end point. However, because of ARR and MRI findings, we suggest a benefit of high-dose vitamin D,” said Dr. Hupperts. The researchers added that vitamin D supplementation may be more effective in early stages of MS, and they found no additional safety issues associated with high-dose vitamin D.
—Erica Tricarico
Suggested Reading
Ashtari F, Toghianifar N, Zarkesh-Esfahani SH, Mansourian M. High dose vitamin D intake and quality of life in relapsing-remitting multiple sclerosis: a randomized, double-blind, placebo-controlled clinical trial. Neurol Res. 2016;38(10):888-892.
Early Administration of Sertraline May Prevent Onset of Depression Following TBI
Sertraline may help to prevent the onset of depressive disorders after a traumatic brain injury (TBI), according to data published online ahead of print September 14 in JAMA Psychiatry.
“Our findings suggest that sertraline given at a low dosage early after TBI is an efficacious strategy to prevent depression after TBI,” said Ricardo E. Jorge, MD, Professor of Psychiatry and Behavioral Sciences at Baylor College of Medicine in Houston.
Every year, there are approximately 1.7 million cases of TBI in the United States. TBI contributes to 30% of all injury deaths and is a major cause of death and disability in the US, according to the Centers for Disease Control and Prevention.
Depressive disorders are common after TBI. In two studies, 58 of 157 patients developed a depressive disorder during the first year following TBI. Dr. Jorge and his colleagues conducted a double-blind, placebo-controlled study to assess the efficacy of sertraline in preventing depressive disorders following TBI. Their main outcome was time to onset of a DSM-IV-defined depressive disorder associated with TBI.
“We hypothesized that the time from baseline to onset of depressive disorders would be greater in a group of patients randomized to receive sertraline treatment versus a group of patients randomized to receive placebo,” said Dr. Jorge. “We also hypothesized that, when compared with patients receiving placebo, patients receiving sertraline would show better performance in a set of neuropsychologic tests after six months of treatment.”
For the study, 94 patients were randomized to receive 100 mg/day of sertraline or placebo once daily for 24 weeks or until the development of a mood disorder. Participants ranged in age from 18 to 85 years and had mild, moderate, or severe TBI. In addition, participants were required to have complete recovery of posttraumatic amnesia within four weeks of the traumatic episode. Patients with ongoing depression were excluded from the study. Furthermore, patients with a history of mood disorders were required to have been in full remission for at least a year following discontinuation of treatment.
Researchers used the Mini-International Neuropsychiatric Interview and DSM-IV criteria to diagnose depressive disorders. Participants were evaluated at baseline and at two, four, eight, 12, 16, 20, and 24 weeks, and the Mini-International Neuropsychiatric Interview was also administered by telephone at weeks six, 10, 14, 18, and 22.
The number needed to treat to prevent one case of depression after TBI at 24 weeks was 5.9. There were no incident cases of anxiety disorders, and one patient had suicidal ideation. Nearly all patients in the sertraline and placebo groups reported mild or moderate adverse events. Sexual adverse events were mild and did not significantly affect participants' quality of life. Dry mouth and diarrhea were more frequent among participants who received sertraline.
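For context, the number needed to treat is, by the standard definition, the reciprocal of the absolute risk reduction; the underlying risk difference is not given in this summary, but an NNT of 5.9 implies that sertraline lowered the 24-week risk of depression by roughly 17 percentage points relative to placebo:

\[ \text{NNT} = \frac{1}{\text{absolute risk reduction}} \quad\Rightarrow\quad \text{absolute risk reduction} = \frac{1}{5.9} \approx 0.17. \]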
“The fact that small doses of sertraline are efficacious to prevent depression after TBI stands in sharp contrast to the lack of efficacy of antidepressants to treat depression in the chronic stage of TBI,” said Dr. Jorge.
Limitations of this study include its small sample size, underrepresentation of ethnic and racial minorities, and limited follow-up after the incident TBI.
—Erica Tricarico
Suggested Reading
Jorge RE, Acion L, Burin DI, Robinson RG. Sertraline for preventing mood disorders following traumatic brain injury. JAMA Psychiatry. 2016 Sep 14 [Epub ahead of print].