In order to better integrate my blog with my website, better manage comment spam, and reduce my dependence on Google, this blog has moved. To avoid broken links I won't be deleting content from here, but no new content will be added, so please update your bookmarks and feeds.

Saturday 5 October 2013


For a while I've wanted to transfer my blog to a WordPress platform on my own domain, for a few reasons:
  • it's nice owning one's own house;
  • it keeps all my stuff together in my own control;
  • it reduces my dependence on Google; and
  • the comment spam on Blogger is driving me up the wall - Google is startlingly bad at managing it, so I get email notifications for all of it and haven't worked out how to filter it in my inbox either.
So after a certain amount of procrastination, it's now done. Tweaking the theme took a while, but the import process went pretty smoothly, with just a couple of things I had to fix by hand. All posts and comments have now been duplicated on the new site, which has its own RSS feed. Please update any bookmarks or feeds accordingly!

(To avoid broken links I won't be deleting content from here, so it should remain as long as Google allows; however, no new content will be added, and in due course I'll disable commenting.)

Friday 4 October 2013

Open access and peer review

We’re likely to be hearing about John Bohannon's new article in Science, "Who's afraid of peer review?" Essentially the author created 304 versions of a fake paper describing bad science and submitted one to each of 304 'author-pays' open access journals to test their peer review. 157 of the journals accepted the paper and 98 rejected it; the remainder were abandoned websites or still had the paper under review at the time of analysis. (Some of the details are interesting: PLOS ONE provided some of the most rigorous peer review and rejected it, while OA titles from Sage and Elsevier and some scholarly societies accepted it.)

Sounds pretty damning, except...

Peter Suber and Martin Eve have each written a takedown of the study, both well worth reading. They list many problems with the methodology and conclusions. (For example, over two-thirds of the open access journals listed in DOAJ aren't author-pays, so it's odd that the study excluded them.)

But the key flaw is even more obvious than the flaws in the fake articles: the experiment was run without any kind of control. Bohannon submitted only to open access journals, not to traditionally published journals, so we don’t know whether the latter's peer review would have performed any better. As Mike Taylor and Michael Eisen point out, this isn't the first paper with egregiously bad science to slip through Science's own peer review process either.