There were presentations from eLife (Dr Wei Mun Chan) and F1000Research (Dr Sabina Alam, @Sab_Ra) in the Innovations in peer-review session. PeerJ was also mentioned several times, for example for publishing its peer reviews.
In general, I think the presenters did a good job demonstrating how modern peer review can benefit authors and research in general: eLife with its consultative peer review, where editors and reviewers discuss their views and opinions before a decision is made, and F1000Research with its open post-publication peer review system. My personal experience with PeerJ (as a reviewer) and F1000Research (as a reviewer and author) has been excellent. All these journals are great venues for a modern open scholar.
Dr Jen Wright (@JennWrights) from Cambridge University Press presented a nice and detailed overview of how peer review works. It was well structured, following an FAQ model. She also very entertainingly illustrated her talk with references to PHD Comics, Lego Grad Student and Shit Academics Say.
Open peer review
The highlight of the day was Corina’s (@LoganCorina) brilliant Open peer review - what is it and what does it achieve? talk. She made a strong point in favour of open peer review and reviewing ethics. Read her lab’s code of conduct, which covers reviewing ethics as well as publishing ethics, her commitment to conducting rigorous science, and lab interpersonal interactions.
It was nice to hear how her efforts in ethical publishing and reviewing proved very positive for her academic career, which contrasts with the fear, sometimes expressed by early career researchers, that practising open science and ethical publishing could hinder their careers.
The role of peer-reviewers in promoting open science
I was also very happy to have the opportunity to give a talk about the role of peer review in promoting open science. My slides are available here. I plan to write it up and expand on it in a blog post.
In brief, my main message was that, if we want to promote rigorous science, we have an obligation to make sure that the data, software and methods are adequately shared and described, and that it is not too difficult or time-consuming to check this as a peer reviewer.
Currently, as far as data is concerned, my ideal review system would be a 2-stage process:
- Submit your data and metadata to a repository, where they get checked (by specialists such as data scientists and data curators) for quality, annotation and metadata.
- Submit your research with a link to the peer-reviewed data.
My talk earned me a lot of feedback and encouragement, both offline and online.
I had heard about Publons before, but never took the time to learn more about it. Tom Culley did a great job presenting it as a means of getting formal recognition for your peer review work. I will definitely give it a go in the near future.
Update 2017-09-13: Since I wrote this post, I created a Publons account, used it actively, then closed my account, requested my data back and asked for it to be deleted from the Publons servers. The reason is that Publons has been acquired by Clarivate Analytics, the company behind the impact factor. While I don’t doubt Publons’ good intentions, I refuse to contribute my review data to a company whose sole interest is to maximise profits. I would make use of Publons’ services, or a similar initiative, if I had reasonable guarantees that my data were used for the benefit of research and researchers. Given the new circumstances, I believe such guarantees are no longer realistic.
Show me the data
I went to Dr Varsha Khodiyar’s (@varsha_khodiyar) workshop Show me the data: tips and tricks with peer-reviewing research data. Varsha is the data curation editor at Scientific Data. I am not necessarily a big fan of data journals (see here for some background), but it is clear that she is doing great work making sure that the data she checks and curates (in addition to the peer review) is available under an open license and of good quality.
When it comes to data/software submissions, I believe that many shortcomings often result from a lack of adequate skills or experience with good practice in sharing and documenting, rather than from poor quality of the output. The review process should ideally serve as a way to support and educate researchers, and the Bioconductor and rOpenSci projects are great examples of how the package review process can genuinely help authors improve their output, rather than delivering a binary accept/reject outcome.
A closed 2-stage peer review, as is typically in place in journals, is a horrible system for that. An open review, with more interaction between reviewers and authors, would be a more efficient approach.
More about the event
To hear more about the event, have a look at the #oscpeereview hashtag on Twitter. The event was live streamed and will be made available on YouTube in the coming days - I will add a link later.
All in all, I think it was a great event. Kudos to the Office of Scholarly Communication for their efforts and continuous dedication. As emphasised by many participants, events like this constitute a unique and important channel for highlighting innovations in digital and open science that are redesigning scholarship. They are also a unique venue where open researchers can express and discuss challenges and opportunities with the wider academic community.