Peer-Review: Please, Tell Me It's Worth the Time and Effort To Do It Right
Wednesday, September 7, 2011 at 11:55AM
Dave Brodbeck in Dave Mumby, Guest Post, Peer Review, Rant, Science, Work

Many thanks to Dave Brodbeck for inviting me to vent where he vents.

I have more than a few ruminations to share, but only one today, and I promise not to be cynical. Really. Instead, I want to try to do something positive whenever I write for public consumption, and that includes when I'm complaining. My goal is to raise issues that affect science students and the community at large, and hopefully to inspire at least a few students or researchers who read my comments to have similar discussions, so that they might somehow contribute to solutions -- or at least avoid being part of some problem or another. Oh, and of course by that I mean what I consider problems.

Today, the problem I want to discuss has to do with the peer-review process in science and research. Before I start railing, however, I want to be clear about something: I believe peer-review is currently the best way we have to apply quality control to what counts as knowledge, useful information, or sound ideas. It is not a perfect process, however, and academic folks have been debating its various problems for decades. I will make no attempt here to contribute to an academic discussion about any of the oft-mentioned limitations and imperfections of the peer-review process. Instead, I'm going to gripe a bit about something I encounter, from time to time, when I serve as a reviewer for some science journal editors.

As a university professor with an active research program, I spend a lot of time participating in the peer-review process. I am often asked to review research reports that have been submitted for publication in scientific journals, or grant applications that other researchers have submitted to some funding agency or another in Canada, the U.S., or the U.K. Serving as a peer-reviewer is something one normally does on a purely voluntary basis (although I recently received a small honorarium for reviewing a grant application for a special award competition -- that was the first time, and it wasn't necessary, but I took it without any hesitation or guilt). I don't agree to serve as a reviewer every time I am asked. I receive about three or four requests from journal editors each month and accept about two-thirds of them; I decline to review when other obligations prevent me from giving a manuscript the time and attention it deserves. I accept all requests to review research grant applications.

I consider service as a peer-reviewer to be an extremely important part of my job, even though it is all voluntary and I receive absolutely no tangible reward or recognition for this service. Except in rare cases, reviewers are anonymous, and only the journal editor knows who provided the reviews. In most academic departments within a large university, no one really has any idea how much time a colleague is spending on peer review. It is not normally listed in a c.v. (although it could be, perhaps should be, and I am sure some people include it). Deans and department heads normally have no idea how much a particular faculty member contributes to peer review, and most of them probably do not care very much, as this activity does not bring any obvious returns to the department or the university. Only my wife and I really know how much time I spend on it. So, peer review is kind of like making an anonymous donation to charity -- you do it because you want to help, you know it's the right thing to do, and you don't need any recognition or payback for doing it.

My guess is that a lot of my colleagues get about the same number of requests to review as I do, and that some get more and some get fewer. I would also guess that, like me, many of my colleagues refuse some, but accept most invitations to review work submitted for publication in their areas of expertise. There is probably a lot of variability, though. It's also my guess that, like me, most of my colleagues also readily agree to review grant applications whenever they are asked to do so. I'm just guessing, but I'd bet that most of them also take the responsibility of peer-review seriously, and that the majority of them do an excellent job on a consistent basis.

Today, my rant is about those other peers of mine who agree to review but then do a lousy job of it, primarily because they just aren't trying hard enough or putting in the time needed to be thorough. In my opinion, the only good reason for wanting to be a reviewer is intrinsic motivation to help ensure the quality and integrity of the data, and the scientific soundness of the arguments, that get published. What other justification is there? Yet sometimes I get the feeling that some people have different reasons for agreeing to review for science journal editors. I mean, if you really care about something enough to do it repeatedly, voluntarily, and anonymously, then wouldn't you want to spend as much time on it as needed to get it done properly?

Here is what set me off and made me decide to write about this topic: a couple of weeks ago, I spent the better part of two days working on my evaluation and review of a manuscript that had been submitted for publication in the highly regarded journal Brain Research. The manuscript was about 40 pages (near the average length), and it was a general review of previous research on an interesting and somewhat important subject. Two days was not an unusual amount of time for me to spend evaluating and writing my recommendations for a paper like this one. I take the responsibility very seriously -- after all, once a paper is published, it becomes part of the public domain of knowledge for posterity. Millions upon millions of taxpayer dollars are spent to support scientific research, and this is one of the main reasons why I care about the integrity of the "facts" that are reported by other scientists, and the soundness of the interpretations and theories they put forth. I even feel somewhat honoured to have this privilege to contribute to something I think is important, and I want to do my best job of it at all times, using my most unbiased and objective scrutiny.

Editors typically give reviewers two or three weeks to evaluate a manuscript and submit a report indicating the reviewer's disposition, concerns, requests for changes, and so on. I almost always submit my review a few days after the deadline. I seem to need the sense of urgency I get from the courtesy email reminder from the editor that the deadline is looming. I procrastinate, but when I eventually start the task, I put most everything else aside until it is done. Some times of year are a bit busier than others (grant review season), but averaged over the whole year, I estimate that I spend about 8 to 10 hours on peer review in a typical week. That's a little more than one full workday for someone with a regular 9-to-5 job. I think it's worth it. I have no doubt that authors and editors are happy to know someone is willing to put in the time.

For this manuscript, the editor sought three reviewers. As is usually the case, after the editor received all three reviews, a decision was made about the fate of the manuscript, and there was an opportunity for each of the three reviewers to see the decision letter, and to view the other two reviews.

To be blunt, I was pissed when I saw the other two reviews. Both consisted of a point-form list of comments, none of which reflected any kind of deep analysis. Most of the comments pointed out minor errors, or asked for clarification on some tangential point or issue. Both seemed like a list of comments that one might make while reading through a manuscript for the first time; you know, short notes on all the obvious shortcomings that jump out at you, like inconsistent arguments, errors of logic, certain typos, or bad grammar.

It's not that any of the comments from the other reviewers were off-base -- in fact, I didn't see any that I disagreed with. Both reviewers also gave pretty clear directions about what the authors needed to do to address their concerns. (That last part is really important. It can sometimes be frustrating for authors when a reviewer has a problem with something in a manuscript but does not say what he or she believes should be done to fix it.) But I really doubt that either of the other two reviewers read through the manuscript a second time, let alone a third time, or that either of them checked the references to see if there were any obvious omissions, or checked to see that important previous work was being cited properly, or stepped back to consider whether the paper actually accomplished the objectives it stated at the outset, or whether it offered an original analysis versus a run-of-the-mill literature review. These bases take a lot of time to cover, but if the peer-review process does not cover them, they will not be covered. There are many other big-picture analyses or beneath-the-surface assessments that a reviewer can try to accomplish when evaluating a manuscript. No individual reviewer can be expected to undertake more than a few of them, but at least a few should be expected and delivered.

Again, I want to be clear -- I'm not disrespecting the intelligence or abilities of my peer experts. I don't think very many of them are actually unqualified to evaluate the scientific work of others. Frankly, I think the problem is just a combination of laziness, lack of pride in good work, and in some cases, a dishonourable reason for wanting to do the job in the first place. I wonder: does anyone ever agree to help editors with the review process in the hope that it will make that editor more likely to be positively disposed toward their own work if they submit it to the same journal? Do people ever agree to review so they can be in a position to quash any data or arguments that challenge their own conclusions? Can any of them actually have such contempt for the noble pursuit of scientific knowledge? Absolutely, and without a doubt, some of my "peers" are just that dastardly in their thoughts and deeds. I know there are jerks and bastards out there. I have met many of them over the years.

So, what can be done? I sure don't know. But, by talking about these kinds of things with my graduate students, I hope to instill in them a sense of responsibility and a desire to keep their integrity while their careers develop and evolve. This way, I hope, they are likely to do a good job of providing peer-review when the time comes for them to contribute. Other than that, I just wish more editors had the balls to give reviewers frank feedback on the quality of the reviews they submit. Maybe some of the editors don't care enough? You know, I have some thoughts about that one.

If enough people want it (and "enough" is only two or three), I'll write again, next time about why I believe science and research in the academic world have systemic flaws that encourage certain bad behaviors by scientists and researchers. Some of the flaws in the system can even explain the half-assed or disingenuous peer review I've been complaining about. But even worse, the problems in the system account for a great deal of squandering of taxpayers' money -- millions of tax dollars that are allocated to support scientific research. The industry is such that there is an incredible amount of waste and inefficient expenditure, and a lot of money is used in ways that benefit some people's research careers more than they benefit science.

Actually, I think I'll just keep writing about this stuff whether or not anyone else is reading it. It's like therapy.

Dave Mumby is a behavioral neuroscientist and a professor in the psychology department at Concordia University in Montréal. He and his students study memory and brain functions. Dave is an academic advisor and the author of Graduate School: Winning Strategies for Getting In. He also has a blog and is a frequent contributor to MyGraduateSchool.com, a website that helps undergraduate students prepare for and apply successfully to graduate school.
