A recent Nature News article by Geoff Brumfiel (Science journalism: Supplanting the old media?) has stirred up many interesting discussions about the relationship between science blogging and traditional science journalism. Good starting places to follow these discussions (and engage in them) are Technorati, Nature.com Blogs and FriendFeed.
Science blogging extends, but also threatens, traditional science journalism. At the same time, aggregators and microblogging services such as FriendFeed, Twitter and Facebook are both an enhancement of and a threat to science blogs. Instead of writing blog posts or commenting on them, many science bloggers spend increasing amounts of time with these services, e.g. in The Life Scientists room on FriendFeed.
But microblogging and aggregation services have also emerged as new tools for another area of science communication, namely the peer review process. The interaction between authors and editors or between editors and reviewers traditionally happens via email (because peer review is usually anonymous, authors don't communicate directly with reviewers). Twitter and similar tools fulfill the requirement for privacy, in the form of direct messages, private rooms, and services aimed at organizations, such as Yammer.
What are the advantages of these tools for the peer review process? All communications can be stored in one place in the form of a discussion thread. FriendFeed and Facebook allow users to mark posts they like, which can show agreement between reviewers. Messages can also be sent to and from non-traditional devices such as cell phones. Many senior researchers are already overworked with peer review, so this way they can at least post their reviews from the golf course or their yacht. And authors want to learn about the acceptance of their paper as soon as possible, which is not necessarily while they are sitting in front of their computer.
But most importantly, microblogging enforces brevity. Virginia Walbot recently complained in a Journal of Biology comment (Are we training pit bulls to review our manuscripts?) about reviewers
"dismissing the years of labor and stating that the manuscript can only be reconsidered with substantially more data providing definitive proof of each claim."
As Twitter messages (also known as tweets) can be at most 140 characters long, reviewers are forced to write short reviews, and editors to write short notes to the authors. And if 140 characters aren't enough, they can always link out to longer text with URL-shortening services like bit.ly.
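The 140-character limit could even be enforced programmatically. As a minimal sketch (the function name, the "(n/m)" numbering scheme, and the reserved suffix width are my own illustrative assumptions, not part of Twitter or any journal's tooling), a longer reviewer note could be split into numbered, tweet-sized pieces:

```python
import textwrap

def split_review(text, limit=140):
    """Split a review into numbered chunks that each fit in one tweet."""
    # Reserve room for a trailing " (n/m)" counter, e.g. " (1/3)".
    reserve = len(" (99/99)")
    chunks = textwrap.wrap(text, width=limit - reserve)
    total = len(chunks)
    return ["%s (%d/%d)" % (chunk, i + 1, total)
            for i, chunk in enumerate(chunks)]

review = ("The manuscript is sound, but the discussion overstates the "
          "scope of the data; please temper the claims and add the "
          "missing controls before resubmission.")
for tweet in split_review(review):
    print(tweet)
```

Each emitted chunk stays within the limit because `textwrap.wrap` breaks the text at word boundaries at a width that leaves room for the counter suffix.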