Group uncovers inconsistent reporting of AEs, deaths

Mark Helfand, MD, OHSU School of Medicine

A new analysis indicates that researchers sometimes exclude unfavorable trial data, whether reporting results in medical journals or on ClinicalTrials.gov.

A group of investigators analyzed trials in which data were reported both on the government website and in medical journals, and they found that discrepancies between the 2 sources were common.

Adverse events (AEs) were more likely to be reported on ClinicalTrials.gov and omitted from journal reports.

But deaths seemed to be underreported or inconsistently reported on ClinicalTrials.gov when compared to journals.

“This is the most comprehensive study of ClinicalTrials.gov to date,” said study author Mark Helfand, MD, of Oregon Health & Science University.

“It shows that patients and clinicians could use [the site] to find information that is not available in the published literature, particularly to get more complete information about the harms of various treatment options. It also shows that, to best serve the public, death rates and some other items in ClinicalTrials.gov should be audited to keep them up to date.”

Dr Helfand and his colleagues reported these findings in Annals of Internal Medicine.

The researchers evaluated 110 trials that were completed by January 1, 2009, and had results reported on ClinicalTrials.gov. The team looked only at trials completed by 2009 to allow time for the results to be published in medical journals. Most of the trials were industry-sponsored.

Analyses revealed a number of discrepancies between data on ClinicalTrials.gov and in medical journals. For instance, 80% (n=88) of the trials had inconsistencies in secondary outcome measures.

In 15% (n=16) of trials, there were inconsistencies in the description of the primary outcome, and in 20% (n=22), there were inconsistencies in the primary outcome value. In most cases, however, these discrepancies were small and did not affect the statistical significance of the results.

There were inconsistencies in AE reporting as well. Of the 84 trials in which a serious AE was reported on ClinicalTrials.gov, 11 published papers did not mention serious AEs, 5 reported that there were no serious AEs, and 21 reported a different number of serious AEs.

Among the trials with inconsistent AE reporting, 87% had more serious AEs listed on ClinicalTrials.gov than in the journal article.

On the other hand, deaths seemed to be underreported on ClinicalTrials.gov compared to journals. For instance, among trials that did not report deaths on ClinicalTrials.gov, 17% had deaths reported in the journal article.

Prior studies have indicated that ClinicalTrials.gov does not have a uniform way of reporting deaths, which may lead to inconsistencies.
