The Scholarly Kitchen is a group blog started by the Society for Scholarly Publishing in 2008. The posts by authors such as Kent Anderson, Phil Davis, David Crotty, and Michael Clarke are always interesting – and often thought-provoking – reading about scholarly publishing. Two recent posts looked at peer review.
The “Burden” of Peer Review
In this blog post David Crotty argues that we overestimate the amount of time the typical researcher spends doing peer review. In his informal survey, most researchers review 1-3 papers per month, and those reviewing many more often do so voluntarily (e.g. because they sit on an editorial board). I haven't yet checked whether there are any formal surveys on the workload of peer review.
Post-publication Review: Is the Dialog of Science Really a Monologue?
In August the BMJ published a paper that looked at the adequacy of author replies to electronic letters to the editor. The cohort study found that authors are reluctant to respond to criticism of their work. In a companion editorial, David Schriger and Douglas Altman write about the possible reasons for this inadequate uptake of post-publication peer review, and argue that we need a change in culture to value public discussion. Philip Davis summarized the two papers and concluded that post-publication review may continue to be spotty and unreliable.
I would disagree with Philip Davis about the conclusions that can be drawn from the cohort study. We all know that many papers receive few if any online comments. But we should instead think about where we could be 3-5 years from now, and how to get there. A recent editorial by Thomas Liesegang in the Journal of Ophthalmology is very relevant to this discussion (Peer review should continue after publication, link via EASE Journal Blog). And Richard Smith gives a wonderful and much broader definition of post-publication peer review in a response to the BMJ editorial:
I would define post-publication review as the process whereby scientists and others decide whether a piece of work matters or not. I suggest that this doesn’t happen much through debate in the correspondence pages of journals, but rather through scientists and other consumers of research recommending others to pay attention to a piece of research, conducting other studies off the back of it, absorbing it into systematic reviews, beginning to act on its conclusions, throwing it in the bin, and taking a thousand other actions that constitute the “market of ideas.”
Using Jupyter Notebooks with GraphQL and the PID Graph
Two weeks ago DataCite announced the pre-release version of a GraphQL API (Fenner, 2019). GraphQL simplifies complex queries, such as retrieving information about the authors, funding, and data citations for a dataset with a DataCite DOI in a single request. ...
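To illustrate the idea, here is a minimal sketch of how such a query might be built from a Jupyter notebook cell. The endpoint URL and the field names (`creators`, `fundingReferences`, `citations`) are illustrative assumptions about the pre-release schema, not confirmed API details; the DOI is a placeholder.

```python
import json

# Assumed endpoint for the pre-release DataCite GraphQL API (hypothetical).
GRAPHQL_ENDPOINT = "https://api.datacite.org/graphql"

def build_dataset_query(doi):
    """Build a JSON payload asking for a dataset's title, authors,
    funding, and citations. Field names are assumptions, not the
    confirmed DataCite schema."""
    query = """
    query ($id: ID!) {
      dataset(id: $id) {
        titles { title }
        creators { name }
        fundingReferences { funderName }
        citations { nodes { id } }
      }
    }
    """
    return json.dumps({"query": query, "variables": {"id": doi}})

# Placeholder DOI for illustration only.
payload = build_dataset_query("10.5061/dryad.example")
```

The payload could then be sent with a single HTTP POST (e.g. `requests.post(GRAPHQL_ENDPOINT, data=payload, headers={"Content-Type": "application/json"})`), which is the point of GraphQL here: one request replaces several REST calls for authors, funding, and citations.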
Infrastructure Tips for the Non-Profit Startup
When I started as DataCite Technical Director four months ago, my first post (Fenner, 2015) on this blog was about what I called Data-Driven Development. The post included a lot of ideas on how to approach development and technical infrastructure. ...