Posts Tagged ‘research’

Challenging the IRB

Tuesday, March 22nd, 2011

When Brown University Associate Professor of Education Jin Li began her research on learning among Chinese immigrant children, she secured funding from private sources, prepared her methodology, and received approval from the IRB. As the research began, she noticed that her plan to provide $600 to all families who participated did not reflect the added effort low-income families expended on the learning assessments, so she decided to offer some families $600 and others $300.

She submitted her modified budget to the IRB and was rejected. Moreover, the Board told her she could not use data collected from families paid only $300 even though they had signed consent forms. Nor could she pay those families an additional $300 because she had run out of funds. So, she is suing Brown for harm to her research.

Originally, IRB approval was intended for research funded by federal sources. A new book, Ethical Imperialism, documents how that mission has grown to encompass all research with human subjects. Social scientists, in particular, find the restrictions ill suited for their work with interviews, archives, and oral histories. Just as under-regulation can be harmful, so can overreach.

From Translation to Convergence

Thursday, January 6th, 2011

In biomedical research, the latest emphasis has been on translational science: connecting bench work to clinical applications. At a recent MIT and AAAS conference, scientists hailed the next revolution in research: Convergence Science.

Their report points to the increasing integration of the physical and engineering sciences into biomedical fields. Harnessing these different disciplines enables innovations like a lab's engineered E. coli bacteria that detect tumors and deliver drugs.

The authors of the paper caution that convergent science will require institutional restructuring. Traditional departmental divisions won't make sense as more work is done in teams. Traditional methods of recognizing individual achievement for promotion will also have to be reworked. Importantly, funding agencies like the NIH will have to retool their policies.

The proposal reminds me of E. O. Wilson's argument in Consilience. He advocated a return to the unification of scientific fields. As logical as convergence sounds, it comes across as a bit naive when so many other reforms have failed to shift the entrenched divisions of disciplines and academic achievement.

The Decline Effect

Tuesday, December 21st, 2010

A hallmark of science is replicability. Another team of researchers following the same methods should be able to reproduce the original results. As Jonah Lehrer writes in The New Yorker, there may be a crippling flaw in this principle.

Lehrer gives examples, from studies of the benefits of antipsychotics to the powers of ESP, in which subsequent experiments yield steadily shrinking effects. This could be a case of muting the influence of outliers on data. But the pattern is so widespread that the decline effect points to something intrinsic to the practice of science.

John Ioannidis has written about the inherent biases in science. One article in PLoS Medicine, entitled "Why Most Published Research Findings Are False," points to the way scientists influence data through their expectations. Publishers' preference for significant results also leads to inaccuracies. I would add that the mantle of objectivity impedes scientists' ability to accept the more qualitative elements of their craft.
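The mechanism behind the decline effect can be seen in a toy simulation. The sketch below is purely illustrative (the effect size, noise level, and publication threshold are invented): if journals publish only the most impressive early estimates, those estimates overshoot the true effect, and later replications, published regardless of outcome, drift back down.

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.2   # the real underlying effect size (invented for illustration)
NOISE = 0.5         # sampling noise in each study's estimate
THRESHOLD = 0.6     # only "impressive" estimates get published at first

def run_study():
    """One study's estimate: the true effect plus sampling noise."""
    return random.gauss(TRUE_EFFECT, NOISE)

# Early publications: drawn only from studies that cleared the threshold
initial = [e for e in (run_study() for _ in range(10_000)) if e > THRESHOLD][:100]

# Replications: published regardless of outcome
replications = [run_study() for _ in range(100)]

print(f"true effect:           {TRUE_EFFECT:.2f}")
print(f"mean initial estimate: {statistics.mean(initial):.2f}")
print(f"mean replication:      {statistics.mean(replications):.2f}")
```

The "effect" appears to decline between the first wave of papers and the replications, even though nothing about the underlying phenomenon changed; the filter on what got published did all the work.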

Tracking Impact

Friday, December 17th, 2010

The Cited Reference Search in the database Web of Science allows users to track how many times a published work was referenced in the academic literature. It indexes journals from the humanities to the sciences and includes conference proceedings.

As complete as that sounds, does it really capture how scholars use academic literature? With so much content migrating online, we can now track other measures of impact, like the number of times an article was downloaded, blogged about, or linked to on a website. One article describes new methods for assessing scholarly impact. One ranking uses a Google-like algorithm to weight citations from prestigious journals most heavily; by that system, the New England Journal of Medicine comes out on top among medical journals for 2008. Another site harnesses social networking by allowing researchers to upload their CVs and share content with other scholars. It takes a holistic approach to evaluating merit by looking at the teaching and service parts of an academic's record.
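The "Google-like" ranking idea can be sketched in a few lines. This is a minimal, hypothetical illustration of the general PageRank approach, not the actual algorithm any ranking service uses; the journal names and citation counts are invented.

```python
# PageRank-style journal ranking: a citation from a prestigious journal
# counts for more than one from an obscure journal. All data invented.
journals = ["A", "B", "C"]

# cites[i][j] = number of times journal i cites journal j (made-up counts)
cites = [
    [0, 4, 1],
    [2, 0, 3],
    [5, 1, 0],
]

n = len(journals)
damping = 0.85
rank = [1.0 / n] * n

for _ in range(100):  # power iteration until the scores settle
    new_rank = []
    for j in range(n):
        incoming = 0.0
        for i in range(n):
            out_total = sum(cites[i])
            if out_total:
                # journal i spreads its prestige across everything it cites
                incoming += rank[i] * cites[i][j] / out_total
        new_rank.append((1 - damping) / n + damping * incoming)
    rank = new_rank

for name, score in sorted(zip(journals, rank), key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")
```

The key property is circular: a journal is prestigious because prestigious journals cite it, which the repeated iteration resolves into a stable score.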


Friday, September 10th, 2010

To conduct research on the effectiveness of the mentoring program we're rolling out on the Boston University Medical Campus, I have been revising an application to the Institutional Review Board. The IRB oversees all research with human subjects to make sure investigators comply with federal regulations.

The motivation for such an oversight body is admirable. Especially with biomedical research, the potential for harm to research subjects is too great to go unchecked. But when it comes to more psycho-social research like the evaluation project I am proposing, the board's requirements can be cumbersome.

A New York Times article from 2007 points out the mission creep of IRBs, which originally applied just to research sponsored by federal grants. As universities require social scientists to go through the process, some of the protections for subjects end up sounding absurd. What's more, the regulations may interfere with a scholar's First Amendment right to study and publish freely. I think of the New York Times itself, which does not have to seek any external approval before it writes potentially harmful pieces about subjects in the news.

In the end, though, the process of crafting an IRB application has been helpful for clarifying the safeguards my study has in place to protect research subjects. If it takes a cumbersome on-line form to get researchers to consider the ethical implications of their studies, then it's a worthwhile cost.

Conduct Unbecoming

Thursday, September 9th, 2010

Charges that Harvard psychology professor Marc Hauser falsified data have been widely reported. Hauser suffered professional embarrassment and the retraction of published papers, but his employer does not seem ready to impose any lasting sanctions. He will take a year off from teaching and advising and then, presumably, return to research.

The Harvard Crimson notes that when other prominent faculty have committed scientific misconduct, the punishment has been similarly lax. The head of psychiatry at Harvard Medical School resigned in 1988 after admitting plagiarism, but was then hired back as chief emeritus. According to sources at Harvard, the university has never revoked tenure for research lapses.

No doubt the exposure has been devastating for Hauser and will cloud any of his future accomplishments. Still, it is not clear that he incurred any institutional penalty for his actions. Falsifying data erodes the core of a university's academic mission, and there should be some commensurate punishment for violating those values. Firing Hauser is not the solution either; he could yet return to research chastened and invigorated. As befits an educational setting, Hauser's experience should become a teachable moment for the rest of the community, not an excuse to banish him for a year.

Living Longer

Friday, July 2nd, 2010

Department of Medicine researchers and their collaborators published a paper in Science this week. Using participants in the New England Centenarian Study, they identified 150 single nucleotide polymorphisms, or, less technically, single-letter variations in the genome, associated with long life.

As with most science stories picked up in the media, the details are less dramatic than the headlines. Senior author Thomas Perls explained to the Boston Globe that the findings do not portend a genetic test for longevity. Environmental factors play an even larger part in health.

Still, with UC Berkeley asking freshmen to submit to a DNA test, the paper arrives at a time when people in the United States are curious about what our genes reveal.


Monday, June 28th, 2010

Since grad school, I've been a fan of the bibliographic software EndNote. In the late 1990s, I had pretty much abandoned the card catalog for finding references, but I had yet to discover the rich, full-text databases that would later emerge. So I didn't mind cutting and pasting authors' names and other publication information into the EndNote fields.

Nowadays, most researchers conduct literature reviews exclusively on-line. If an article's full-text does not appear on the web, I find myself questioning whether I really need that citation. Over the years, EndNote has added more web functionality with the ability to import fields and link to PDFs and URLs. Still, it remains a separate, proprietary system that sits on my hard drive.

I'm trying a new program called Zotero. It organizes citations directly within your browser. Because it's integrated into the very frame in which you search for sources, it captures text easily and seamlessly. Citations can be tagged like blog posts and organized into collections. With a little more setup, you can access your bibliographies from a remote computer. And it's free! I know there are a lot of competing programs out there, but Zotero seems to have been designed by researchers for other researchers.

Poisonous Plastics

Tuesday, May 25th, 2010

This weekend 60 Minutes broadcast a report about the dangers posed by phthalates. Phthalates are chemicals found in flexible plastic consumer items like shower curtains, vinyl raincoats, and even rubber duckies. Research by University of Rochester professor Shanna Swan has shown that exposure to phthalates interrupts the production of testosterone in young boys, leading to misshapen sexual organs.

Dr. Swan's research is a good example of the power of medical research. It has led to a federal law banning the use of phthalates in the manufacture of toys.

At the same time, the 60 Minutes piece demonstrates the dangers of scientists bringing their findings to the public. Lesley Stahl follows the media rule of showing both sides of every argument, whether or not they deserve equal weight. She interviews a businessman who must spend $8,000 to test a toy microscope for the outlawed toxin. She also talks to a scientist who says that experiments done in rats do not necessarily apply to humans.

When biomedical scientists communicate their conclusions more broadly, they have to be aware that not everyone understands what proof means in a research setting. They have to be particularly careful to specify what their findings can mean and to educate reporters about the tentative nature of scientific knowledge.

Conflicts of Interest

Monday, May 24th, 2010

The National Institutes of Health recently unveiled new guidelines governing conflicts of interest in biomedical research. These suggestions will be subject to public comment for sixty days.

According to one report on the new regulations, most stakeholders seemed pleased with the move to increased transparency. Everyone from the AAMC to Iowa Senator Charles Grassley applauds the idea of holding institutions accountable for revealing financial ties between researchers and industry.

In an age of eroding public confidence in institutions, it seems crucial that biomedical research remain untainted by the appearance of impropriety. Still, I'm struck by what the additional reporting assumes about the scientific process.

If having a financial stake in the research can affect a study's outcome, what does that say about the objectivity of science? Are PIs intentionally manipulating data to produce a favorable result, or are there less conscious ways that bias creeps in?