Posts Tagged ‘peer review’

Reviewing the Reviewers

Monday, December 19th, 2011

The editors of the Annals of Emergency Medicine rate the quality of reviews submitted by peer reviewers. Over 14 years, they collected close to 15,000 reviews from about 1,500 reviewers. Although originally for internal use, the ratings offered an opportunity to study the change in reviewers’ submissions over time.

In a study published earlier this year, the editors found that nearly all reviewers received lower quality ratings over time. Overall, the quality of reviews remained high because newly recruited reviewers turned in highly scored reviews.

The authors speculate that their findings may provide evidence of cognitive decline as reviewers age. That seems overly speculative. A more likely explanation is that new reviewers take their responsibilities seriously, while senior faculty have less to prove and more competing demands on their time. Still, their experience makes them valuable voices in the peer review process.

Pitfalls of Peer Review

Monday, November 14th, 2011

In 2006, Nature tried an experiment. The journal receives about 10,000 manuscripts a year and sends 40% of them out for traditional peer review. In the trial, the editors asked authors whether they would also submit their paper for open peer review, in which any scientist could leave signed comments. 71 authors agreed.

The journal promoted the experiment heavily on their website, through e-mail blasts, and with targeted invitations to scholars in the field. After four months, they reviewed the results. Despite sizable web traffic to the site, 33 papers received no comments, and the most heavily commented-on paper received only 10 replies.

Nor did the editors find the comments influential in their decisions about whether to publish. They found that although many scientists approved of the idea of open review, very few were willing to take part in it.

Their experiment demonstrates both the promise and the pitfalls of social media. It opens up the possibility for dialogue, but it depends on self-motivated users to enrich the content.

Punking Peer Review

Wednesday, September 7th, 2011

The Open Information Science Journal is an open-access, peer-reviewed journal published by Bentham and indexed in Open J-Gate and Genamics JournalSeek.

Phil Davis, a postdoc at Cornell, was curious how rigorous the journal's review process was. So he used software to generate a realistic-looking but gibberish article called "Deconstructing Access Points."

As the figure on the right shows, the article looked scientific but in reality made no sense. Still, four months after submission, Dr. Davis received word from the editor that his article had passed peer review and was accepted for publication. All he had to do was send $800.
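The post doesn't name the tool Davis used, but generators of this kind typically work by recursively expanding a context-free grammar stuffed with field-specific jargon. Here is a minimal sketch in Python; the grammar and vocabulary are invented for illustration, not taken from Davis's actual software:

    import random

    # A toy context-free grammar: uppercase keys are symbols to expand,
    # lowercase entries are terminal words. The jargon is invented.
    GRAMMAR = {
        "S": [["NP", "VP"]],
        "NP": [["the", "ADJ", "NOUN"], ["the", "NOUN", "of", "NP"]],
        "VP": [["VERB", "NP"], ["VERB", "that", "S"]],
        "ADJ": [["stochastic"], ["distributed"], ["epistemic"]],
        "NOUN": [["access point"], ["methodology"], ["paradigm"]],
        "VERB": [["deconstructs"], ["synthesizes"], ["problematizes"]],
    }

    def expand(symbol):
        """Recursively expand a symbol into a random string of terminals."""
        if symbol not in GRAMMAR:
            return symbol  # terminal word: return it as-is
        return " ".join(expand(s) for s in random.choice(GRAMMAR[symbol]))

    # Each call yields a grammatical but meaningless sentence, e.g.
    # "The stochastic paradigm problematizes that the methodology of
    # the epistemic access point synthesizes the distributed paradigm."
    print(expand("S").capitalize() + ".")

Because every sentence is grammatically well formed, the output skims as plausible prose; only a reader who actually engages with the content notices it says nothing.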

He declined to pay, but he wrote about the experiment for a scholarly publishing blog. His trick recalls the Sokal hoax, in which a physicist submitted a nonsense paper to a humanities journal, got it published, and revealed the ruse afterward. But where Sokal was poking fun at the meaninglessness of postmodernism, Davis is pointing to the lax oversight at some open access journals.

Not all online journals are this craven, but the episode shows that peer review is no guarantee of quality.

Peer Reviewed Posters

Tuesday, July 19th, 2011

In an earlier post, I mentioned the growth of online peer review. One of those sites, Faculty of 1000, provides post-publication peer review. Content experts evaluate published papers and score them based on their importance.

The site has now expanded to include reviews of posters at academic conferences. You have to register to read the evaluations, but any user can scan the submissions. And it's another avenue for faculty to demonstrate the reach of their research.

Traditional peer review seems fairly entrenched for journal submissions, but when it comes to posters and conference papers, the web provides a universal platform for disseminating ideas.

Inside Peer Review

Monday, September 27th, 2010

I just read How Professors Think by Harvard sociologist Michèle Lamont. In the book, Lamont goes behind the scenes of peer review by observing the deliberations of several nationally competitive grant panels. All universities talk in vague terms about valuing excellence, but in these concentrated deliberations, academics make plain what constitutes excellent work.

In interviews with 71 panel members--all seasoned professors--she asked what cues in a grant proposal signal excellence. Five qualities came up in over half the interviews:

  1. Significance (mentioned by 92% of respondents)
  2. Originality (89%)
  3. Clarity (61%)
  4. Methods (58%)
  5. Feasibility (51%)

These priorities indicate that the best proposals nail the big questions first. The applicant should start by asking, "Why does the research matter?" and "What is novel about my approach?" If those questions get answered clearly, the proposal has won the reviewers over.

Lamont's ethnography took place among panels evaluating humanities and social science awards, but its lessons hold true for the medical sciences as well.