
Dynamic Chiropractic – January 1, 2016, Vol. 34, Issue 01

Your Data Is Showing: The Fight for Transparency in Research Is On

By Anthony Rosner, PhD, LLD [Hon.], LLC

Data manipulation, whether it occurs in the form of skewed voter registration, gerrymandering or outright ballot fraud, is unfortunately a practice with which we are all too familiar. But scientific data? Let's start with these two glaring examples from the past decade or so:

  • Reports released by the General Accounting Office on March 4, 2002, indicated that a Pentagon agency, two major military contractors and an independent research team led by MIT researchers produced flawed studies which exaggerated the success of a key test used to justify spending billions of dollars on a fledgling national missile defense program. In one test, an infrared sensor built by Boeing failed to cool to a temperature sufficient for it to function properly, and a noisy power supply was also present. The excess heat and noise could have caused a distortion of up to 200-fold, causing the sensor to detect targets where none existed. A congressional source close to the GAO suggested two-thirds of the data may have been tossed out to make the test look like a success.1
  • In clinical trials conducted by Pfizer comparing the efficacy of two antifungal agents (fluconazole and amphotericin B), the latter product was administered inappropriately in most cases, so that its capacity to compete with fluconazole (Pfizer's own product) would be fatally compromised from the get-go. Not surprisingly, it was. Yet these results got past peer reviewers and into the indexed literature. Also not surprisingly, 92 percent of all patients came from trials funded by none other than Pfizer.2-3

Undermining Sound Science

Rounding up the usual suspects, to borrow a phrase uttered by Captain Renault in the classic film "Casablanca," we can easily identify conflict of interest as the prime culprit. In this regard, dating back 15 years, Greg Koski, director of human research affairs at Partners Health Care, pointed out that in research, "conflicts of interest are very real, very serious and a threat to our entire endeavor. During the past five years, they may have gotten out of control. Public trust has been eroded." Indeed.

As a sort of field guide to the tactics used to undermine sound science, Rosenstock and Lee did a commendable job in 2002 of listing the following four categories:4

  1. Economic manipulation: This is seen in (a) the delay of research publication because results were negative or a patent application was in progress; (b) disparities in the hazard classification of chemicals depending upon whether the study authors had financial ties to industry; (c) the hiring of scientists by the tobacco or pharmaceutical industries to discredit evidence of secondhand-smoke harm or to favor their products; and – an example well-known to the chiropractic community – (d) lobbying by orthopedic surgeons for zero funding of what was then the Agency for Health Care Policy and Research (AHCPR, now AHRQ) after the agency published back pain guidelines favoring nonsurgical over surgical approaches.
  2. Delay: Examples include (a) initiating congressional reports or inquiries that stall work on, for example, ergonomic standards; (b) demanding additional peer reviews; or even (c) initiating litigation.
  3. Hidden identities: Vested interests sometimes have hidden their identities by masquerading as grassroots organizations. Examples include (a) the National Coalition on Ergonomics, which actually opposed ergonomics standards; (b) the Food Chain Coalition, representing the pesticide industry; (c) Doctors for Integrity in Research and Public Policy, opposing gun control and handgun research; and (d) the Center for Patient Advocacy, the aforementioned orthopedic group that lobbied against the AHCPR.
  4. Harassment: Self-evident. The less said, the better.

Failing to Report Outcomes (and Then Denying They Exist)

Adding to suspicions that the reporting of scientific results did not always follow Robert's Rules of Order were the 2004 findings that 71 percent of clinical trials measuring the efficacy of a therapeutic intervention had at least one unreported outcome, and 60 percent of trials measuring a harm outcome also had at least one unreported outcome. Worse, when trial authors were queried about these "unreported outcomes," 86 percent initially denied that any existed in their work, despite the fact that all of their trials showed clear evidence of them.

Finally, in what strikes me as a parting gesture, this exposé pointed out that literature reviews tend to overestimate the effects of a given therapy, particularly when the alternatives are expensive, ineffective or harmful.5

The Fight for Transparency Begins

To tame the outlaws in Dodge City, one of the world's leading scientific journals chose to take matters into its own hands and produced the most comprehensive series of guidelines to date on the publication of studies in the basic sciences. Specifically, the journal Science called for the adoption of clearly defined regulations demanding the sharing of raw data and methods.

The new guidelines, known as TOP (Transparency and Openness Promotion), established a system intended to be applied across any number of diverse fields of journal publication. These guidelines set forth three levels of disclosure, in ascending stringency:

  • Level 1 required journals to have authors indicate whether raw data are available and, if so, where.
  • Level 2 insisted the data be deposited in a trusted databank.
  • Level 3 stipulated that, in addition to the posting of data, an independent group be called upon to perform an audit and analysis of the data driving a research paper prior to its publication.

On top of this, preregistration of study methods, design and hypotheses – already the law for most clinical drug trials and required by many journals – was mandated as well.6

All the more reason to hope readers of scientific papers remain vigilant, even skeptical, and feel free to take advantage of the available means to acquire critical appraisal skills. Otherwise, you can't help but cast a wary eye on such quotations as this one from the actor Edward G. Robinson, who, as a gangster in the film "Key Largo," schools Humphrey Bogart on the fine art of data "cleaning":

"Let me tell you about Florida politicians. I make them. I make them out of whole cloth, just like a tailor makes a suit. I get their name in the newspaper. I get them some publicity and get them on the ballot. Then after the election, we count the votes. And if they don't turn out right, we recount them. And I recount them again. Until they do."

References

  1. Abel D. "MIT Team Tied to Questionable Missile Studies." Boston Globe, March 4, 2002.
  2. Johansen HK, Gotzsche PC. Problems in the design and reporting of trials of antifungal agents encountered during meta-analysis. J Am Med Assoc, 1999;282(18):1752-1759.
  3. Rosner A. Fables or foibles: inherent problems with RCTs. J Manip Physio Ther, 2003;26(7):460-467.
  4. Rosenstock L, Lee LJ. Attacks on science: the risks to evidence-based policy. Am J Public Health, 2002;92(1):14-18.
  5. Chan A-W, Hrobjartsson A, Haahr MT, Gotzsche PC, Altman DG. Empirical evidence for selective reporting of outcomes in randomized trials: comparison of protocols to published articles. J Am Med Assoc, 2004;291(20):2457-2465.
  6. Carey B. "Top Journal Puts Out Comprehensive Guidelines for Publication of Science Studies." The New York Times, June 26, 2015.
