
We Couldn’t Do It without YOU

Every year, we get to work with new authors, reviewers, and editors who are ushering in the next wave of scientific advancement. We love publishing your work, reading your reviews, and learning from your expertise, and we just want to say THANK YOU for supporting PLOS.

Wow, did we really do all of that?

We did! This has been a banner year for PLOS journals. In 2018 we saw more research articles published in PLOS Biology than ever before, began publishing Topic Pages in PLOS Genetics and Benchmarking articles in PLOS Computational Biology, partnered with bioRxiv to post over 1,300 preprints, and committed to moving forward with published and signed peer review. That’s on top of all of the special issues, Calls for Papers, and collections we’ve published on topics ranging from Climate Change and Health to Gender and NTDs.

We’d also like to extend a warm welcome to more than 3,000 new members of the PLOS ONE Editorial Board who have joined us this year to provide more expertise for submission areas that need it most – we’re glad you’re here!

For everything we do at PLOS, we are supported by the dedication of our research communities.

Together, we’re stronger

We are a community of more than 8,000 editors, 65,000 reviewers, and 150,000 authors. When we work together, we can make change happen in scholarly communication. Last year PLOS Pathogens editors hosted six writing workshops to help Early Career Researchers improve their skills and equip them with the tools they need to become authors. We also hosted interactive events like live-streamed preprint journal clubs to bring authors and experts from the community together for real-time feedback on their work.

We’re listening to your feedback from our surveys, event meetups, and Section Calls and want to continue evolving our services in ways that matter to you.

We’re working on new ways for reviewers to get credit for their work through ORCiD as well as signed and published peer reviews. We’re also going to continue the process improvements we’ve started on PLOS ONE to bring a faster, clearer process to our authors along with a number of exciting new options on other journals – stay tuned!

Cite it, share it, celebrate it

For everyone who has contributed to our success this year: our dedicated Editorial Board, incredible Guest Editors, and inspiring reviewers – these articles are for you!

We’re sure we will have many more opportunities to thank you this year but please join us in celebrating your achievements this week by sharing your PLOS contributions with #PLOSCommunity.

Celebrate Open Data Day with Us!

 

Around the world tomorrow, groups from all sectors will be celebrating Open Data Day – an annual event that highlights the benefits of open data and encourages the adoption of open data policies in government, business and civil society. As a publisher, we see data availability as crucial for validating and building on new research, and as key to our mission of helping researchers advance the scientific record. In the spirit of Open Data Day, we’ve decided to defer to researchers and data enthusiasts to answer the questions: why is data important, and how can we make it better?

We got our answers from Sudhakaran Prabakaran, one of a number of dedicated volunteers at the University of Cambridge, known as Data Champions, who advise members of the research community on managing their data. Read his thoughts below.

Who are the Data Champions? What kinds of data questions do you help researchers navigate?

I think it’s a fantastic forum. [We have] a lot of discussions and people exchange ideas and not necessarily just in the sciences but also in every other field. [Data management] can be kind of confusing, even simple things like can I put my [datasets] in Dropbox? Can I share them in Google Drive? You’re talking about even labeling stuff in desktop computers. There is no clarity because this landscape is fast-moving and people are not trained to catch up with that kind of speed at which things change.

What are you working on right now? How does open data play a role in it?

Our lab thrives on open data. We train machine learning algorithms that look at specific regions of the human genome, trying to identify the most important mutations and then identify drugs to target them. Most of the datasets I work with have already been published and analyzed – people have extracted what they want. I’m kind of looking at things that they don’t want – I’m looking at non-coding regions, just kind of digging deeper into the datasets.

Why do you think open data is important? What do you think the future open data landscape looks like?

I don’t think open data is enough…it’s the analysis also. For example, we train a lot of machine learning algorithms, and in the process we fail many, many times – we know the pitfalls, we know what to avoid. If you share that process with other people, that will enable them to overcome those pitfalls and get there. It is very difficult, and that process can be shared with people.

I think future young people are going to be brought up in an environment where they can just click something and get access to the code and get access to the data themselves. And then the issues of reproducibility would be mitigated if you can share what you’ve done and the data set is there for other people to work with.

What advice would you give to authors and researchers to encourage them to share their data?

I think we have encountered these scenarios even as a data champion in my own department. I think if you have incentives, as in [getting] your DOI and authorship for the dataset even before publication, then it’s easy to share. It’s your data; [someone] can probably do a different kind of analysis and publish it, but they have to cite this data and you will benefit from that.

And it’s in the best interest of the authors to share it ahead of time because of reproducibility.

 

What can you do to encourage good data management?

You can practice the open data lifestyle by sharing your research data in an open repository and making it available when you submit your manuscript. If you’re reviewing a submission, knowing how to evaluate the associated datasets can be tricky, which is why we’ve worked with the Data Champions to cover everything you need to know in this Reviewer’s Quick Guide to Assessing Datasets. If the manuscript you’re reviewing doesn’t have an associated dataset, request it!

About the Data Champions Program

The Data Champions Programme is a network of volunteers who advise members of the research community on the proper handling of research data. In doing so, they promote good research data management (RDM) and support the Findable, Accessible, Interoperable, and Re-usable (FAIR) research principles. It is run by the Research Data Management Facility at the University of Cambridge.

 

‘How do we define success?’ – Rethinking failure and success in science

Independent of the context, failure is a word that hardly ever leaves us indifferent. Fear of failure is human nature, and we commonly prefer not to talk about failures if we can avoid it. In a professional context, failure can have clear and immediate ramifications for reputation and career progression, and – like any other professionals – researchers are not immune to this fear of failure.

Part of this approach to failure in research is due to the fact that the research system has traditionally rewarded those who are the first to report a finding over those who are second, and those who report a positive result over those reporting a negative one. However, research generally involves a trial-and-error approach and a plethora of negative findings, or protocols that require troubleshooting before they are fine-tuned. Thus ‘failed’ experiments are common – more so than is often recognized or reported. Much effort and many hours of meticulous research endeavour go unrecognized by current research assessment frameworks, resulting in a considerable squandering of potentially important research outcomes.

The ‘Failures: Key to success in science’ event at the Cambridge Festival of Ideas 2018 aimed to reflect on these considerations in a conversation involving our five panellists and the audience around the notions of failure and success in science.

Our five panellists kicked off the conversation by giving their perspective on what a successful research career should look like.

‘How do we define success?’ asked Cathy Sorbara (Co-chair of CamAWiSE, Cambridge for Women in Science and Engineering), maintaining that science does not have a defined endpoint, that collaboration should be a key part of the research process, and that scientists need to think about how they communicate their work, particularly to those unfamiliar with research. Tapoka Mkandawire (PhD candidate, Sanger Institute) felt that a key aspect of success is to work on something that you feel passionate about and are keen to share. The audience was interested in the forms that communication of research could take and the panellists noted that communication about research should not be restricted to publications, putting forward ideas around visual formats such as videos. Tapoka noted that her research group has developed a comic book to more easily describe their work to children.

A common theme was that the binary classification of success vs failure is somewhat unfair. Should a result be tagged as a failure only because it’s negative and has not been published? Fiona Hutton (Head of STM Open Access Publishing at Cambridge University Press) advocated the development of a more collaborative open pathway for research, with more openness at all steps of the process, such as that demonstrated with open lab notebooks, to capture the incremental steps that make up the research process. The sharing of negative and null results should be encouraged as well, as should a move away from frameworks that rely on impact factors to assess the quality of research; Fiona mentioned DORA as a good initiative in this space, which is gaining support from institutions and funders.

Arthur Smith (Deputy Manager of Scholarly Communication (Open Access), University of Cambridge) and Stephen Eglen (Reader in Computational Neuroscience, University of Cambridge) tackled the challenges with the current research system and acknowledged that this places Principal Investigators (PIs) as the ‘survivors’ of the system, with only a few reaching the top of a steep pyramidal career structure. Stephen stressed that the driving force for getting into research should be a genuine interest in science and not the goal to eventually become a PI. Arthur noted that there are many other career paths available after a PhD and that the skills gained can be used in many other areas, such as the private sector. The training of PhD students should include aspects that go beyond publishing, and should balance this with the development of communication and other skills.

To round up the discussion we asked panellists to provide recommendations for steps that can help shift perceptions about success and failure in science. Here is what they told us:

  • More support for early career researchers, so that they can have an informed, broader view of their career, and of the options after a PhD.
  • Further recognition for the wide range of different roles that scientists play beyond the publication of research findings – for example, peer review activities, mentorship, etc.
  • Provision of credit for recording and reporting troubleshooting, for any work that may not follow the shape of a conventional publication but which would help others engaged in related research.
  • More training for those in a research path, to help them develop a variety of transferable skills, and to recognize the value of those skills.
  • Increased diversity – higher diversity can only be beneficial in driving change towards how success is defined.

Achieving these aims and helping to sway current views about failures in research represents a formidable task, but – much like science itself – change progresses one step at a time and we hope that the engaging conversation at the Festival of Ideas provided one such step to shift how we define “success” in research. As we pursue initiatives towards such change, let’s remind ourselves of Arthur Smith’s definition of success: ‘Success is what makes you happy’.

 

From Preprint to Publication

Live preprint journal clubs provide early feedback for PLOS ONE authors

We love it when preprints go on to be accepted as formal journal publications and we are especially excited to announce that EMT network-based feature selection improves prognosis prediction in lung adenocarcinoma, a featured preprint in our Open Access week event, is now published in PLOS ONE!

In October, we celebrated OA Week’s theme of “Designing Equitable Foundations for Open Knowledge” by teaming up with PREreview to host virtual preprint journal clubs where researchers from around the world could share their expert opinions on preprints AND get credit for their reviews. Thanks to this event, the authors of this preprint received crowd-sourced feedback on their work even as their submission underwent formal peer review by PLOS ONE.

Lead author Borong Shao took advantage of the unique opportunity to participate in the discussion and we asked her to tell us what she thought of preprints and the virtual journal club experience. Read her thoughts below:

Can you tell us a little bit about your research? What made you decide to post the work as a preprint?

We were working on the topic of molecular signature identification using multiple omics data. The reason we posted our work was to let our new results reach the research community. In our experience, preprints are read and discussed by researchers just as much as formally accepted articles.

How does your field or research community feel about preprints in general?

In my opinion, preprints are welcomed if the work has a great idea to share. This can assist or even inspire other researchers in their work without waiting for the article to be formally accepted.

Tell us about your experience discussing your preprint at a live journal club—How did you feel about the opportunity?

I was a bit nervous because I had no such experience before. I wondered whether the audience would have positive or negative opinions about my work, although I think my work has its value. I was excited too, because our work is read by researchers all over the world. Some of them are from a relevant but not the same discipline. I was curious to know their opinions on our manuscript.

Did you use any of the feedback from the virtual journal club? Did you find this kind of feedback useful in general?

Both my professor and I found the suggestions from the virtual journal club very helpful. They gave us useful advice from the viewpoints of both readers and researchers. Much of the feedback can be implemented in a short time to improve the quality of our work. Some other feedback can be learned from and used in our future research. There were a few mistakes that we might not have found if we had not learned of them from the PREreview feedback.

 

Preprints aren’t just helpful to authors – early comments from your community can also help editors at the journal conduct their evaluation of the work. PLOS ONE Academic Editor Aamir Ahmad had the opportunity to handle Dr. Shao’s submission and felt that the early feedback process was “a great initiative… the feedback was excellent in general and the authors did a good job of incorporating the changes.”

PLOS wholeheartedly supports preprints and the myriad benefits they offer researchers. We’re making it easier for authors to share their work as a preprint, immediately upon submission, through our posting service in partnership with bioRxiv, and we were happy to find another partner in PREreview, who have pioneered live preprint journal clubs where early discussions like these can take place.

You can find more information on preprints here and live-streamed journal clubs here. Please also join us in congratulating Dr. Shao and her co-authors on their recent publication!

 

PLOS Provides Feedback on the Implementation of Plan S

We welcome Plan S as a ‘decisive step towards the realisation of full open access’1, in particular the push it provides towards realization of a research process based on the principles of open science. This is fully aligned with our mission to bring scientists together to share work as rapidly and widely as possible, to advance science faster and to benefit society as a whole. Our publications have operated in line with the core principles outlined in Plan S since the launch of our first journal, PLOS Biology, in 2003. We recognize that wide adoption of support for Plan S may bring additional competition within the open access publishing space. We welcome this evolution as a positive change in research culture, resulting in greater availability of information, growing inclusion in the scientific process and increasing the speed of discovery and innovation.  Below is our response to the call for public feedback.

Feedback Questions

  1. Is there anything unclear or are there any issues that have not been addressed by the guidance document?

While welcoming Plan S, its principles and stated intentions, there are some points where we believe additional clarification would be beneficial.

A. Changing research assessment

We are glad to see emphasis on changing research assessment and commitment to the principles of DORA as part of Plan S. We believe this is critical to enabling change in publication behaviours, allowing the value of research outputs to be assessed on their merits rather than through an aggregated metric based on publication venue. However, we note that although the original publication of Plan S states that members of cOalition S ‘commit to fundamentally revise the incentive and reward system of science, using the San Francisco Declaration on Research Assessment (DORA) as a starting point’1, the implementation guidance states only that ‘cOAlition S members intend to sign DORA and implement those requirements in their policies.’ We ask cOalition S to provide clarity regarding the steps that will be taken to drive the ‘fundamental change’ indicated in the original publication.

B. Transformative agreements

While recognizing the need for a route for subscription journals to transition away from publication behind paywalls, we believe that without stringent guidelines and compliance checks, ‘transformative deals’ may have significant unintended consequences, reducing choice and narrowing the market. As identified by Adam Tickell in his 2015 review2, the need for ‘OA policy to offer greater choice to research producers’ remains, and we believe this should be a primary consideration for cOalition S in considering the future shape of the research and innovation market, particularly as it relates to the assessment and communication of research findings.

‘Transformative agreements’ offer advantage to the largest players and to publishers with substantial subscription businesses, as smaller publishers have to ‘wait in line’ to enter negotiations while those, including but not limited to PLOS, without legacy subscription businesses cannot participate. We acknowledge that the intention of cOalition S members is that ‘transformative agreements’ should not decrease the amount of money available in the system to fund publishing in other compliant venues; however, we believe this is the likely outcome as limited institutional and library publication budgets become tied into large ‘read and publish’/’publish and read’ (RAP/PAR) deals. This perpetuates the dominance of the ‘big deal’ in the market, which in its rebranded ‘publish and read’ form has the potential to become the status quo rather than a step towards transformation, much as hybrid journals have become the status quo in relation to open access. Moreover, the transition of subscription ‘big deals’ into ‘RAP/PAR’ deals risks locking the high cost of subscriptions into an open access future, if the deals so far are anything to judge by. We would like to see a ‘clear and time-specified commitment to a full Open Access transition’, as outlined in the implementation guidelines, be a central requirement for all journals covered by a ‘transformative agreement’ in order to be considered compliant with Plan S. We also ask for greater clarity on the allowed start and end dates for these agreements.

C. Deposition in open repositories

While we understand that this is a recommendation rather than a mandatory criterion for compliance with Plan S, we believe that the proposal that there be ‘direct deposition of publications by the publisher into Plan S compliant author designated or centralised Open Access repositories’ has the potential to add cost and complexity to compliance.

Currently, we and many other publishers syndicate our published articles to PMC/Europe PMC. The process of direct deposition to each repository is not without cost, requiring both staff and technical resources to set up and to run. These costs will increase should it become necessary to deposit to a range of ‘author designated’ repositories. This is especially the case given the importance of equitable treatment of publications from researchers in different disciplines and/or geographical regions, particularly as cOalition S grows.

We encourage cOalition S to reconsider this recommendation and propose deposition in a small number of recognized repositories or dispatch services, to facilitate compliance.

D. Publication costs and APC caps

In the published guidance, cOalition S calls for ‘full transparency and monitoring of Open Access publication costs and fees’ and indicates the potential for ‘standardisation of fees and/or APC caps’. We understand that the cOalition has revised this position and intends to call for transparency but not to introduce set caps. We welcome this change of approach, which we would like to see reflected in the next iteration of the written guidance. We believe that requiring transparency will allow funders, or others paying the costs of publication, to assess the value of their payments while minimizing the potential for unintended consequences.

In considering potential unintended consequences, there is a useful parallel with the introduction of tuition fees at universities in England. Since tuition fees were introduced in 1998, they have been capped by the UK government. According to a House of Commons Library briefing paper3, each time that the cap has been raised, almost all English HEIs have increased their fees to the maximum allowed level. When it was announced that the cap would increase to £9,000 from 2012, Lord Willetts, then Minister for Universities and Science, said that the maximum fee would be charged only in ‘exceptional circumstances’4 and it was anticipated that this would ‘create a market in fees’5. This market did not emerge; in fact, nearly all HEIs set their fees at the maximum allowed rate. We believe that there is significant potential for an analogous situation to emerge in relation to APCs. Rather than creating a market where publishers set APCs at the lowest level that covers their costs sustainably, it is more likely that caps would encourage APCs to be set at the maximum allowed level even if this is substantially higher than the publisher’s costs.

Additionally, the cost associated with the publication of an individual article is highly variable dependent on publication venue. The level of editorial activity, including building relationships with, and providing support to, authors, referees and academic editors is a significant contributor to cost but generates substantial value for the research community. The level of selectivity of the journal or platform is also an influencing factor, as more selective publications incur additional costs through assessing articles that do not go on to be published in that venue. While we recognize and support the need to change the measure of selectivity from one focused on journal impact factors, we believe that the ability to differentiate levels of selectivity based on appropriate and meaningful criteria should continue, where selective  journals and platforms can demonstrate their value through community engagement and cost transparency. We believe that this will support a thriving research and innovation ecosystem more effectively than moving to a ‘one size fits all’ approach.

  1. Published simultaneously as follows: a) M. Schiltz, available from https://www.scienceeurope.org/wp-content/uploads/2018/09/cOAlitionS.pdf; b) Schiltz M (2018) Science without publication paywalls: cOAlition S for the realisation of full and immediate Open Access. PLoS Biol. 16(9): e3000031. https://doi.org/10.1371/journal.pbio.3000031; c) Schiltz M (2018) Science Without Publication Paywalls: cOAlition S for the Realisation of Full and Immediate Open Access. PLoS Med 15(9): e1002663. https://doi.org/10.1371/journal.pmed.1002663; d) Schiltz M (2018) Science Without Publication Paywalls: cOAlition S for the Realisation of Full and Immediate Open Access. Front. Neurosci. 12:656. doi: 10.3389/fnins.2018.00656
  2. Open access to research publications: independent advice. Professor Adam Tickell, Provost and Vice-Principal, University of Birmingham, and Chair of the Universities UK Open Access Coordination Group. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/499455/ind-16-3-open-access-report.pdf
  3. House of Commons Library, Briefing Paper Number 8151, 19 February 2018: Higher education tuition fees in England. https://dera.ioe.ac.uk/31330/1/CBP-8151%20_Redacted.pdf
  4. As above p.6
  5. As above p.10, Section 3.2

 

  2. Are there other mechanisms or requirements funders should consider to foster full and immediate Open Access of research outputs?

We believe that diversity and equality of opportunity, including for new entrants to the market, should be retained and encouraged to ensure a thriving and diverse research and innovation ecosystem. Referring to part (B) of our answer to Question 1, we encourage cOalition S to use this opportunity to move away from ‘big deal’-style arrangements as rapidly as possible to avoid further consolidation around the largest players in the market.

Focusing on regulation of existing business models, both transformative agreements and APCs, may have the unintended consequence of creating barriers to diversification in the market. We applaud the support indicated in the implementation guidance for ‘a diversity of models and non-APC based outlets’ and encourage cOalition S to ensure equal emphasis on the development of new business models, alongside consideration of established approaches. We believe this is vital in order to maintain choice for researchers.

Boosting Open Science Hardware in an academic context: opportunities and challenges

Written by: Jenny Molloy (University of Cambridge), Juan Pedro Maestre (University of Texas, Austin)

Experimental science is typically dependent on hardware: equipment, sensors and machines. Open Science Hardware means sharing designs for this equipment that anyone can reuse, replicate, build upon or sell, so long as they attribute the developers on whose shoulders they stand. Hardware can also be expanded to encompass other non-digital inputs to research, such as chemicals, cell lines and materials, and a growing number of open science initiatives are actively sharing these with few or no restrictions on use.

A growing number of academics are developing and using open hardware for research and education, in addition to sharing their papers, data and software through broader open research practices. This brought a large cohort to the Gathering for Open Science Hardware (GOSH) in Shenzhen, China, in October 2018, a four-day event which convened over 110 of the most active users and developers of open science hardware from 34 countries and multiple backgrounds including academia, industry, community organising, NGOs, education, art and more. PLOS kindly supported an unconference session during GOSH 2018 where students and researchers shared the following opportunities and challenges to boosting open science hardware in an academic context and planned a course of action to further the goal of the Global Open Science Hardware Roadmap: to make open science hardware ubiquitous by 2025.

Opportunities for open science hardware in academia

Open science hardware has some important intrinsic benefits. Firstly, it can reduce the cost of research, democratising opportunity and enabling limited budgets to stretch further. Joshua Pearce of Michigan Tech University has calculated a return on investment of hundreds to thousands of percent for funders of open hardware through a drastic reduction in lab costs. Secondly, it reduces duplication of effort by building on the work of others, and thirdly, it provides opportunities to customise hardware to suit your optimal experimental design, rather than designing your experiment to fit the limitations of available hardware. Moreover, sharing more details of experimental designs facilitates replicability in science. This is needed more than ever given the current lack of trust in science in some societal contexts and fears within several scientific communities of a “reproducibility crisis”.

Gaining additional credit, citations and collaborations are all significant potential opportunities for academics developing open science hardware and are necessary to incentivise those activities. However, cultural change is required within existing systems of academic publication and reward to realise the opportunities. Change is coming: for example, the recently established Journal of Open Hardware and HardwareX encourage formal publication of research advances and designs that are well documented and appropriately licensed, while the PLOS Open Source Toolkit channel highlights and rewards open hardware publications. We know that open approaches can reap rewards, but there is room for further evidence in the hardware context. Open access publications and shared datasets can confer a citation advantage, and many projects developing open research tools report high numbers of collaborations and significant funding that may not have been possible without their culture of sharing. The Structural Genomics Consortium is involved in publishing over two papers per week, partly as a result of hundreds of collaborations built on making data and tools freely available. Research funders can be responsive to openness as a strategy to maximise impact: UK-based research centre OpenPlant was awarded £12m to make open technologies for plant synthetic biology, and two open source projects on diagnostics for infectious diseases were awarded >£1m from the UK’s Global Challenges Research Fund.

Educational use of open science hardware also reaps both tangible and intangible benefits for universities. It represents an opportunity to increase the quality of teaching and learning by providing access to instruments that would otherwise be too expensive in the numbers required for effective teaching. It also contributes to building critical thinking skills and breaking open the “black box” of laboratory equipment. There are many academics in the GOSH Community involving their students directly in developing open science hardware, such as air quality sensors at the University of Texas, Austin or biological instrumentation through the Biomaker Challenge in Cambridge. Still others, such as the Centro de Tecnologia Acadêmica at UFRGS in Brazil, are using open hardware tools extensively in student lab practicals and research projects.

Challenges to address if open science hardware is to become ubiquitous

There are several barriers to wider adoption of open science hardware in academia. One stumbling block is institutional buy-in and support: in these times of limited funding, many universities have become conservative about approaches to intellectual property and patenting of inventions. Encouraging an open approach to maximising societal and scientific impacts through technology and knowledge transfer requires a compelling narrative. This includes reassurance that openness is contextual: in some cases the traditional route of IP protection and restrictive licensing may be optimal to achieve the intended outcomes; in others it is not, and open approaches should be considered a strategic option. It is also important to emphasise that open does not equal non-commercial. Indeed, there are many examples of entrepreneurial academics and spin-off companies selling open hardware back into academia, but also to industry, non-profits, educational institutions and directly to the public.

Funding for ongoing support and scaling of open science hardware efforts is a perennial and important topic of discussion at GOSH. In the case of open science hardware, private investors may not consider open designs as maximizing profit opportunities but they can still be profitable and generate significant social and scientific returns. A major task for the GOSH academic working group formed at the unconference session is therefore to compile justification for a diverse range of funders including private philanthropists, social impact investors and venture funds to support open science hardware and further the goal of making it ubiquitous and widely used by 2025.

The final topic of discussion during our session was creating awareness among the scientific community, both online and offline at major scientific conferences. Offering community-level incentives, support and guidelines to document and share open science hardware is feasible, and there is much low-hanging fruit. However, we have seen in other areas of open research that to achieve ubiquity these community efforts need to be backed by formal incentives and rewards. In other words, the value of open approaches has to be recognised in funding, promotion and hiring decisions.

Furthering open science hardware through community action

Four priority actions emerged which correspond closely to recommendations in the Global Open Science Hardware Roadmap: i) leverage the GOSH Community and network to produce guidance and case studies for universities, funders and other stakeholders; ii) put open science hardware on the agenda at large disciplinary conferences; iii) raise awareness through mainstream academic channels; and iv) take the initiative within our own institutions to experiment with ideas and build local communities.

We invite anyone who is interested in open science hardware to join this work to ensure that more researchers, students and those outside of academia have access to vital enabling technologies for science. You can sign the GOSH manifesto, join the GOSH Forum to share your projects and contact organizers@openhardware.science for more information.

NOTE: The PLOS Open Source Toolkit collects papers from across publishers that describe software and hardware with research applications. The site is curated and managed by five active researchers, including the author of this blog post, Jenny Molloy. Meet all the editors here and here.  We’re on a mission to make exciting, cost-effective, and high-utility tools accessible to all researchers to eliminate barriers to scientific innovation and increase reproducibility. We post new content monthly. Subscribe for notifications. Currently featured: an open source K-mer based machine for subtyping HIV-1 genomes.

Acknowledgements

Many thanks to PLOS for their kind support enabling people in need of financial support to attend GOSH and to the participants in the unconference session: Juan Pedro Maestre (University of Texas, Austin), Pierre Padilla (UPCH), Andre Chagas (University of Sussex), Jenny Molloy (University of Cambridge), Moritz Riede (University of Oxford), Benjamin Pfaffhausen (Freie Universität Berlin), Marina de Freitas (CTA-UFRGS), Minerva Castellanos Morales (Scintia), Tobias Wenzel (EMBL), Anne-Pia Marty (University of Geneva), Alex Kutschera (Technical University of Munich), Eduardo Padilha (University of São Paulo).

Images

GOSH 2018: https://www.flickr.com/photos/goshcommunity/44847829654/in/dateposted/

Other GOSH images and credits can be found here.

Illustrations from the GOSH Roadmap can be found here.

All Gathering for Open Science Hardware photos and roadmap images are in the public domain under a CCZero waiver and available on Flickr https://www.flickr.com/photos/goshcommunity/

FluoPi: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0187163


Image credit: Nuñez et al (2017), licensed under CC-BY 4.0.

Caption: Bacteria and cell-free protein expression systems generating fluorescent proteins and imaged using the FluoPi.

OpenFlexure Scope: Openflexurescope.jpg

Image credit: Dr Richard Bowman, University of Bath

Caption: Open source, 3D-printed microscope stage imaging onion cells on a Raspberry Pi camera. The stepper motors enable focusing and moving of the sample stage.

Public Lab: https://publiclab.org/system/images/photos/000/020/856/large/IMG_20170622_113106_860_2.jpg

Image Credit: Public Lab, licensed under CC-BY-SA 3.0

Caption: Members of Public Lab balloon mapping oil spills and water pollution with open source kits.

DropBot: https://www.microfluidic-chipshop.com/wp-content/uploads/2018/02/DropBot-DB3-120-998×1024.jpg

Image Credit: Sci-Bots Inc.

Caption: Open hardware digital microfluidics system made by Sci-Bots.

 

PLOS Board Appointments

 

After a careful search and much consideration, we are excited to share with our community five new appointments we’ve made to the PLOS Board. This is a pivotal time for PLOS, and as you’ll see, each member will bring us a different perspective, which will enable us to expand the ways in which we serve our scientific communities.

Our new Board Chair is Alastair Adam, currently CEO of innovative digital textbook publisher, FlatWorld, who brings to the role not only a strong understanding of publishing – including scientific journals – but also his business savvy and strategic skills. Alastair joined the Board effective November 1 and assumed the Chair role on January 1, 2019, replacing our longtime Board Chair, Gary Ward (more on Gary a little later).

We also added Dr. Simine Vazire, who is currently a Professor in the Department of Psychology at UC Davis, where her research focuses on one of the oldest and most fundamental questions in psychology: how do we know ourselves? In 2017, she was awarded a Leamer-Rosenthal Prize for Open Social Science in recognition of her efforts to advance reproducibility, openness and credibility in the social sciences. She previously served as a senior editor of Collabra: Psychology and as Editor-in-Chief of Social Psychological and Personality Science. Her scientific and editorial expertise brings a well-rounded and diverse perspective to our Board and will help to ensure that working scientists retain a strong voice there.

Dr. Victoria Coleman joined the Board in May 2018. She is currently the Chief Technology Officer at the Wikimedia Foundation where she sets the organization’s technical roadmap for the evolution, development, and delivery of core platforms and architecture.  Victoria brings valuable technology experience to the Board at a time when PLOS, like many mid-size publishers, faces important and difficult choices about its technology infrastructure. Victoria serves in several advisory roles including the Board of the Santa Clara University Department of Computer Engineering and as Senior Advisor to the Director of the University of California Berkeley’s Center for Information Technology Research in the Interest of Society.

We also wanted to ensure that we maintain deep experience in PLOS’ core biomedical science fields, and we are very lucky to have Professor Keith Yamamoto of UCSF agree to join us (effective February 1, 2019). Keith is a highly regarded scientist running his own research lab, and he also has extraordinary experience in the policy arena, having focused much of his career on science practice, education, communication, and advocacy, including strong and early support for OA. He currently serves as UCSF’s first vice chancellor for Science Policy and Strategy.

Last but by no means least, Suresh Bhat joined us on November 1, 2018 as incoming Chair of the Finance Committee. Suresh brings to PLOS not only deep financial knowledge but also experience at a top research university and a passion for education. Suresh has headed finance programs for a number of financial institutions. He is currently CFO and Treasurer at the Hewlett Foundation; prior to that, he was CFO at the Haas School of Business at UC Berkeley (and is a Haas and Cal alum).

I would be remiss if I did not take the opportunity here to express my heartfelt thanks to both Gary Ward, our now former Board Chair, and Mike Eisen, one of the co-founders of PLOS, both of whom left the Board in 2018.  In his seven years as Board Chair, Gary has led the Board with passion, wisdom and integrity, and has been both counsel and friend to many of us in the organization. Mike is of course irreplaceable in every way. His vision, zeal and dedication are a big reason that PLOS not only exists but has had such a deep impact on scientific communication. I have no doubt that Mike will continue to be one of PLOS’ greatest advocates (and yes, let us know when we get it wrong – as good friends do!).

While goodbyes are never easy, we are excited to embark on this new chapter for PLOS with the fresh wisdom of so many exceptional, dedicated individuals. Please join us in welcoming our new Board members!

PLOS Authors Say “Yes” to Preprints

We’ve surpassed 1,300 preprint posts to bioRxiv!

This is an incredible milestone for us and for all of our authors who chose to opt in to our preprint service since we announced our partnership with Cold Spring Harbor Laboratory’s bioRxiv six months ago. We wanted to bring an easy preprint-posting option directly to the submission process for our authors and are thrilled with the results we’ve seen so far.

The road to preprints

As we began this journey, about 4% of our authors reported that they had posted their submission to a preprint server. While this base remains consistent, our preprint-posting service has built upon it to offer authors more choices. In the past six months we’ve seen an additional 14% opt in to have PLOS post a preprint on their behalf, indicating that 18% of our authors want to use preprints to share their research.

Of course, the opt-in rate varies by discipline. At PLOS Computational Biology, 46% of our authors choose to make a preprint of their manuscript available, with half of those posting before submission and the other half requesting that PLOS post to bioRxiv on their behalf. In biology in general, adoption is high. PLOS Biology, which joined the service later, is already showing a promising trend, with 39% of our authors choosing preprints (23% of whom elect to have PLOS post on their behalf).

Every opt-in we get is screened by editorial staff before posting to ensure the article fits bioRxiv’s scope and that no sensitive information is accidentally shared. We have also taken a conservative approach and avoided posting research that could have an impact on human health before the claims have been peer reviewed, which is why we do not yet offer to post preprints for PLOS Medicine authors. We’re working in partnership with bioRxiv to refine the posting criteria as we learn more about the needs for early sharing in different communities.

Overall, the openness to new research outputs we’ve seen among our community of authors is inspiring and we hope to see preprint adoption grow even more over the coming year.  

 

Author choice

We like preprints because they put your research first. We’re making it easier for you to choose preprints as a way to rapidly disseminate your research results, establish priority, accumulate citations for your work, and receive input from your community that may help shape the future of your research.

That said, preprints aren’t for everyone or for every paper, which is why authors choose when and how their work becomes available. We’re also listening to our community’s feedback to make our service as inclusive as possible.

Many of our authors still prefer to wait for peer review before making their results public. However, about a fifth of the authors who responded to a survey about why they had opted out said they are unfamiliar with preprints. We’re hoping to change that by offering everything you need to know at plos.org/preprints. More information about preprints is available on bioRxiv along with their posting guidelines. ASAPbio also offers very useful guidance for preprints, including preprint policy at other journals which may help clarify any concerns you have about submitting a manuscript after you’ve posted a preprint.

Where we go from here

We’ll continue learning from our community and sharing more information that helps you make the right decision for your paper. We’re also encouraging other preprint options for authors in areas that don’t fall under bioRxiv’s scope. Both PLOS Genetics and PLOS ONE have dedicated Preprint Editors to solicit submissions from various preprint servers, and we’re looking at more opportunities.

If you’re thinking of posting a preprint for the first time, take advantage of this checklist to get started and review all the benefits preprints could have for your work.

 

Attention Earth Sciences: PLOS ONE wants YOUR Preprint

A dedicated team of Editorial Board Members is now actively seeking manuscripts in the Earth sciences from the preprint servers EarthArXiv and ESSOAr.

Preprint servers offer a myriad of benefits to authors who are excited to share their work with the community as soon as possible, so we’ve offered our authors the ease of automatically posting their life science submissions on bioRxiv. But PLOS ONE is a community of many different voices and we want to help promote preprints in all disciplines. This includes providing authors with more reasons to post a preprint – on top of the advantages that posting a preprint already offers, such as faster dissemination and allowing for input from the whole community. We’re therefore delighted to announce the introduction of a new program to invite submissions of posted preprint manuscripts specifically in the Earth and Space sciences. Our aim is to support authors posting their papers with a fast and efficient peer review process and journal publication of their work.

Introducing PLOS ONE Preprint Editors

Going forward, we’ve tasked a small group of PLOS ONE Editorial Board members with reviewing and inviting preprint submissions from EarthArXiv and ESSOAr that they feel would be a good fit for the journal. This group will be led by Section Editors Guy Schumann (Bristol University, UK) and Juan Añel (University of Vigo, Spain) along with dedicated Preprint Editors Xiaole Sun (Stockholm University, Sweden) and Julien Bouchez (Institut de Physique du Globe de Paris, France).

As part of this program, submissions invited through preprint servers will receive special attention from the staff editors, which may include extra promotion on social media. Climate change papers may also be recommended to the “Responding to Climate Change” Channel, of which Juan Añel is also an editor.

“As a preprint editor one can have a substantial positive impact and contribute a potentially very high added-value to the scientific community of a particular research field.”

  • Guy Schumann, Section Editor PLOS ONE

We are truly excited to place this program in the hands of these individuals who’ve proven their dedication to their communities and eagerness to advance scholarly outputs for scientific communication in the Earth and Space sciences.

Why we choose preprints

Recruiting research from preprint servers is nothing new in academic publishing; other journals, like PLOS Genetics and eLife, already do so. Preprints represent huge opportunities for improvement on slow publication times. When it comes to critical issues like climate change and others, getting results out sooner can have a dramatic impact on our ability to advance science and foster early collaboration and debate on new research results.

“For me, a main advantage of preprints is that they can help to advance science faster, with public exposure of what is going on, what is cutting-edge”

  • Juan Añel, Section Editor PLOS ONE

I’ve never posted a preprint before, should I?

Yes! The benefits are endless. Preprints are an easy way to generate exposure for your research before you even decide where to submit (ESSOAr also accepts uploads of conference posters and other materials). When you post a preprint, you have immediate and unlimited reach, allowing you to stake the first claim on your methods and results, and even get early feedback from your community. Sounds great, right? Preprints are also beneficial for early career researchers who need discoverable, citable content that speaks to their academic contributions and can help advance their careers.

“Particularly for young scientists, who are the major driving force for science today and need a… good publication record to look for their next job, preprints would be a very [good] choice for them to publicize their findings in a timely way and “decorate” their CV”.

  • Xiaole Sun, Preprint Editor PLOS ONE

We encourage you to join us in our support of preprints, not just in the Earth sciences but across all disciplines. Preprints are already one of the fastest growing research outputs, and we can all do our part to make them an even more successful outlet for new communities that are just beginning to explore their potential.

 

Towards minimal reporting standards for life scientists

A group of journal editors and experts in reproducibility and transparent reporting is putting together a framework for minimal reporting standards in the life sciences. As part of this group, PLOS Executive Editor Veronique Kiermer shares a joint announcement.

Transparency in reporting benefits scientific communication on many levels. While specific needs and expectations vary across fields, the effective use of research findings relies on the availability of core information about research materials, data, and analysis. These are the underlying principles that led to the design of the TOP guidelines, which outline a framework that over 1,000 journals and publishers have elected to follow.

In September 2017, the second major TOP guidelines workshop hosted by the Center for Open Science led to a position paper suggesting a standardized approach for reporting, provisionally entitled the TOP Statement.

Based on discussions at that meeting and at the 2017 Peer Review Congress, in December 2017 we convened a working group of journal editors and experts to support this overall effort by developing a minimal set of reporting standards for research in the life sciences. This framework could both inform the TOP statement and serve in other contexts where better reporting can improve reproducibility.

In this “minimal standards” working group, we aim to draw from the collective experience of journals implementing a range of different approaches designed to enhance reporting and reproducibility (e.g. STAR Methods), existing life science checklists (e.g. the Nature Research reporting summary), and results of recent meta-research studying the efficacy of such interventions (e.g. Macleod et al. 2017; Han et al. 2017), to devise a set of minimal expectations that journals could agree to ask their authors to meet.

An advantage of aligning on minimal standards is consistency in policies and expectations across journals, which is beneficial for authors as they prepare papers for publication and for reviewers as they assess them. We also hope that other major stakeholders engaged in the research cycle, including institutional review bodies and funders, will see the value of agreeing on this type of reporting standard as a minimal expectation, as broad-based endorsement from an early stage in the research life cycle would provide important support for overall adoption and implementation.

The working group will provide three key deliverables:

  • A “minimal standards” framework setting out minimal expectations across four core areas of materials (including data and code), design, analysis and reporting (MDAR)
  • A “minimal standards” checklist intended to operationalize the framework by serving as an implementation tool to aid authors in complying with journal policies, and editors and reviewers in assessing reporting and compliance with policies
  • An “elaboration” document or user guide providing context for the “minimal standards” framework and checklist

While all three outputs are intended to provide tools to help journals, researchers and other stakeholders with adoption of the minimal standards framework, we do not intend to be prescriptive about the precise mechanism of implementation and we anticipate that in many cases they will be used as a yardstick within the context of an existing reporting system. Nevertheless, we hope these tools will provide a consolidated view to help raise reporting standards across the life sciences.

We anticipate completing draft versions of these tools by spring 2019.  We also hope to work with a wider group of journals, as well as funders, institutions, and researchers to gather feedback and seek consensus towards defining and applying these minimal standards.  As part of this feedback stage, we will conduct a “community pilot” involving interested journals to test application of the tools we provide within the context of their procedures and community. Editors or publishers who are interested in participating are encouraged to contact Veronique Kiermer and Sowmya Swaminathan for more information.

In the current working group, we have focused our efforts on life science papers because of extensive previous activity in this field in devising reporting standards for research and publication.  However, once the life science guidelines are in place we hope that we and others will be able to extend this effort to other areas of science and devise similar tools for other fields.  Ultimately, we believe that a shared understanding of expectations and clear information about experimental and analytical procedures have the potential to benefit many different areas of research as we all work towards greater transparency and the support that it provides for the progress of science.

We are posting this notification across multiple venues to maximize communication and outreach, to give as many people as possible an opportunity to influence our thinking.  We welcome comments and suggestions within the context of any of these posts or in other venues.  If you have additional questions about our work, would like to be informed of progress, or would like to volunteer to provide input, please contact Veronique Kiermer and Sowmya Swaminathan.

On behalf of the “minimal standards” working group:

Karen Chambers (Wiley)

Andy Collings (eLife)

Chris Graf (Wiley)

Veronique Kiermer (Public Library of Science; vkiermer@plos.org)

David Mellor (Center for Open Science)

Malcolm Macleod (University of Edinburgh)

Sowmya Swaminathan (Nature Research/Springer Nature; s.swaminathan@us.nature.com)

Deborah Sweet (Cell Press/Elsevier)

Valda Vinson (Science/AAAS)

 

Manuscripts Selected For Live-streamed Preprint Journal Club Event

We are teaming up with PREreview during Open Access week to bring together scientists from around the world to discuss and review an actual preprint…live-streamed! We have now selected a manuscript for each discipline and finalized the moderators. Read the original blog here. Details about how you can register are at the bottom of the page.

Neuroscience – Monday, October 22, 9am PDT / 12pm EDT / 5pm GMT+1

Bioinformatics – Tuesday, October 23, 9am PDT / 12pm EDT / 5pm GMT+1

Ecology – Wednesday, October 24, 9am PDT / 12pm EDT / 5pm GMT+1

REGISTER NOW to join us and invite others to come along. You can choose to join any (or all) of the subject areas listed above. We are using the video conference software Zoom, which is free to download. Please register and we will send you the information on how to join the calls.

Get ready to spend one hour with your colleagues diving straight into the preprint and resurfacing with constructive feedback for the authors. This is a great opportunity to share your expertise with peers from all over the world, learn about preprints, build your network, and get credit for your feedback.

Gathering Steam: Preprints, Librarian Outreach, and Actions for Change

Note: This post was written by Robin Champieux, Research Engagement and Open Science Librarian at OHSU. Robin is the co-founder of the Metrics Toolkit and the Awesome Foundation Libraries Chapter. Her work and research are focused on enabling the creation, reproducibility, accessibility, and impact of digital scientific materials.

Librarians, are you talking about preprints? A preprint is a complete scientific article posted on a public server before peer review. Preprints speed dissemination and encourage early feedback. But this post isn’t about defining preprints and their value. It’s October, I’m a scholarly communication librarian, so I’ve been panicking (I mean thinking) about what to do for Open Access Week. This year, I’m focusing my outreach efforts on preprints and I want to tell you why.

I am passionate about open science and realizing its benefits, but I am just as passionate about supporting student, faculty, and institutional success. These goals, which are increasingly aligned, require a deep understanding of the scholarly communication and research landscape, and right now preprints are a center of conversation in this space [1]. Funders like the NIH and the Helmsley Charitable Trust are encouraging researchers to share and cite preprints, and the Chan Zuckerberg Initiative is even requiring it:

“To encourage rapid dissemination of results, any publications related to this funded work must be submitted to a preprint server, such as bioRxiv, before the first submission to a journal.” [2]

Journals are accepting manuscripts previously posted as preprints, inviting submissions from preprint servers, and linking to preprint versions of papers under consideration [3]. The number of published preprints is rising [4] and stakeholders are responding with support and concern [5].

Researchers have to navigate this context and make decisions about how to share their work: librarians can contribute to their success and effect change by responding to this need. In my experience, most researchers encounter and experiment with innovations in scholarly communication via specific points of choice or pressure. Librarian-led outreach related to preprints can lead to conversations that catalyze a deeper interest in scholarly communication issues and changes. Collectively and over time, these small and personal experiments, and the discussions that surround them (including critical ones), help move the needle towards openness.

I also believe in a definition of and critical approach to open science (and librarianship) that acknowledges how history, inequality, and privilege influence our scholarly communication practices and priorities.  Author and NYU Scholarly Communication Librarian April Hathcock advocates for flipping the script on how we view open:

“Rather than looking at it as a means of getting mainstream scholarship out to the margins, instead I want us to see it as a way of getting scholarship from marginalized communities into our mainstream discourse.” [6]

Preprints sit within a larger and evolving discussion about how scholarship is communicated and endorsed. I am excited about initiatives, like PREreview, that communities are developing around preprints to address issues of inclusion and representation in peer review. From this perspective, preprints are a tool for democratizing “access to [and participation in] science on a global scale” [7]. As librarians, we can support these projects and invent new ones that leverage preprints to take up April’s challenge.

So, what can you do to raise the volume on the “power of the preprint” and address the reasons why some researchers are reluctant to share them?

  1. Do a deep dive on the preprints landscape.  ASAPbio’s preprint info center is a great starting place.
  2. Connect with your institution’s preprint champions and ask them about their motivations.
  3. Use your preprint and institutional knowledge to launch an engagement and demystification campaign.  Low bandwidth? Keep it simple by promoting existing events and reusing the work of others.
  4. Start a discussion with organizations on your campus working to address inequities and increase diversity in science about this year’s Open Access Week theme and the role preprints might play.
  5. Model the shift to open!  Share your own scholarship on LISSA, e-LIS and other preprint servers.

 

Failures: Key to success in science

This post was collaboratively written by PLOS staff (Ines Alvarez-Garcia, Phil Mills, Leonie Mueck and Iratxe Puebla)

Note: Join us Monday, October 15 at the Cambridge Festival of Ideas for a free interactive discussion with panelists from CamAWISE, University of Cambridge, the Sanger Institute, and Cambridge University Press to talk about how we can shift perceptions on failure and success in scientific careers.

Most scientists are fortunate to have a job that they love and feel passionate about, but a ‘successful’ career in research can be among the hardest career paths to pursue – dragons hide in almost every corner. In spite of many hours of effort and meticulous protocols, experiments fail and careful measurements yield unexpected results that one’s hypotheses cannot explain. There is also the risk that the same or related results are suddenly published by another group. These events are often regarded as failures in a research environment, but are they really failures? Could they actually hold the key to success in science?

There is scope to redefine what failure and success mean in science, and there are signs that the recipe for having a successful research career is changing. Being ‘scooped’ by a ‘competitor’ actually provides confirmation for your results, so shouldn’t that be viewed as support for the validity of your findings? Disseminating results fast in all of their forms, positive or negative – for example, as preprints – can also contribute significantly to the advancement of science; sharing all scholarly outputs allows others to build on the findings and prevents duplication of efforts.

Researchers are often constrained by a metric-based evaluation system that doesn’t reflect all the nuances, interactions and efforts that contribute to research endeavours. But some institutions and funders have started looking at their assessment framework afresh with a view to encouraging a more responsible use of metrics in the context of researcher assessment. At the same time, studies have consistently shown that there’s a lack of diversity in many scientific disciplines. In today’s landscape, where we have more options to travel and share information than ever before, we should move towards a place where diversity and collaboration are both sought and rewarded.

We will tackle all these questions and many more in the Cambridge Festival of Ideas event, “Failures: Key to success in science”. We’ll host an interactive discussion to engage the public and bring forward recommendations for the research communities on how we can better celebrate efforts and discoveries that do not currently fit the mould of a ‘successful’ research career. Our five panellists will help us explore the topic from their different perspectives:

Stephen Eglen, a reader in Computational Neuroscience (University of Cambridge)

Fiona Hutton, a publisher (Open access journals, Cambridge University Press)

Tapoka Mkandawire, a PhD candidate (Sanger Institute, Cambridge)

Arthur Smith, Acting Joint Deputy Head of Scholarly Communication (University of Cambridge)

Cathy Sorbara, Co-chair of CamAWiSE (Cambridge Association for Women in Science and Engineering)

If you’re in the Cambridge area, why not join us and participate in the debate, which will take place on Monday 15th October at Anglia Ruskin University, Lord Ashcroft Building Room 002. Book your free ticket here. Can’t attend? We’ll be live tweeting during the event. Check us out at #RethinkFailure. The Festival of Ideas hashtag is #cfi2018. We’ll also recap the entire day in a follow-up blog.

 

Live-streamed Preprint Journal Clubs!!!

Note: This post was written in collaboration with the PREreview team (Monica Granados, Samantha Hindle, Daniela Saderi)

We are teaming up with PREreview during Open Access week to bring together scientists from around the world to discuss and review an actual preprint…live-streamed!

PREreview helps scientists receive acknowledgement for their time spent reviewing others’ work. We are proud to collaborate with them to shine a spotlight on our shared goals: to encourage more scientists to post feedback on preprints, and to provide credit for their contributions to peer review. PREreview’s aim is to promote the discussion of preprints at journal clubs by providing resources and a platform where these discussions can be shared, and to ensure reviews are citable by assigning a digital object identifier (DOI) for each preprint review.

In the spirit of Open Access week, together we decided to leverage today’s technology to make these journal clubs more accessible to a diverse group of scientists by running them entirely online!

Want to participate? Read on for more details.

What is a preprint and how can it enhance your research?

Preprints are open access scientific manuscripts that have not yet been through editorial peer review. Because they are shared freely online while undergoing more formal peer review in a journal, they accelerate the communication of knowledge, thereby increasing the impact and reach of scientific discovery. Backed by many funders, journals, and institutions, preprints have become a legitimate part of research dissemination. To learn more about how preprints can help you and your science move forward, visit PLOS’ preprint resource page and ASAPbio.org.

One of the many advantages of preprinting is the potential for authors to receive immediate feedback from the rest of the scientific community and improve their article before formal publication in a journal. In a sense, it’s similar to the feedback one might receive after a talk or a poster presentation at a scientific meeting, except that it makes the process easier and more inclusive because anyone around the world with an internet connection can comment on your work.

Here’s the plan – Join us and become part of the discussion!

WHAT

We will facilitate three interactive preprint journal club events that will be live-streamed via videoconference in which anyone can participate remotely. The focus will be on neuroscience, bioinformatics and ecology. Each online journal club will be hosted by two facilitators with experience in mediating video calls, as well as subject matter experts. They will guide participants through a constructive discussion of the preprint and collate feedback into a review report. These PREreviews will receive a DOI and will be shared online with the community as citable and discoverable objects. For more information about the format of PREreview online preprint journal clubs, check out PREreview’s information doc or email contact@prereview.org.

WHO

You! We encourage all interested researchers to be part of the discussion and offer their expertise. Of course, anyone who wants to see the power of preprints in action is also welcome. All you need is an internet connection or phone-in capabilities to join.

WHEN

Neuroscience: Monday, October 22, 9am PDT / 12pm EDT / 5pm GMT+1

Joining as a subject matter expert: Dr. Tim Mosca, Jefferson University

Bioinformatics: Tuesday, October 23, 9am PDT / 12pm EDT / 5pm GMT+1

Joining as subject matter experts: Dr. Shannon McWeeney and Dr. Ted Laderas, Oregon Health and Science University

Ecology: Wednesday, October 24, 9am PDT / 12pm EDT / 5pm GMT+1

Joining as a subject matter expert: Dr. Timothée Poisot, Université de Montréal

HOW: REGISTER NOW to join us and invite others to come along. You can choose to join any (or all) of the subject areas listed above, and we will follow up with call-in details and specifics about the preprint that will be discussed closer to the event. Not sure if you can join yet? Sign up anyway to receive updates on the events and decide later.

WHERE: Online, via the video conference software Zoom, which is free to download. Please register and we will send you the information on how to join the calls.

Get ready to spend one hour with your colleagues diving straight into the preprint and resurfacing with constructive feedback for the authors. This is a great opportunity to share your expertise with peers from all over the world, learn about preprints, build your network, and get credit for your feedback.

Power to the Preprint: An Update

Earlier this year, we partnered with Cold Spring Harbor Laboratory to provide authors the opportunity to post their manuscripts as a preprint on bioRxiv through our submission system. In July, we started to implement a new feature linking PLOS articles to their matching bioRxiv preprint—regardless of whether they were submitted through the PLOS integration. This means any article published in a PLOS journal will be linked to its bioRxiv preprint, if posted, and adds on to the existing links from bioRxiv preprints to their journal publications.

What’s the benefit? For starters, it ensures that the early impact of the preprint is visible alongside the publication and enables the reader to learn the paper’s history. Forming these links shows more of the vital history of a research work, with a public life that started as a preprint, was shared online, and continued through peer review to journal publication. By linking papers with their preprints, we hope to make an important part of the paper’s life cycle accessible to our readers.

We actively support preprints as a vital part of the scientific literature. Preprints enable authors to get results out early, gain feedback on their manuscript from a wider community, accrue citations and time stamp their work. Preprints are indexed in Crossref and Google Scholar so they form a documented part of the ‘research story’ for the journal publications that may follow them. In the future we’ll work toward forming links with more preprint servers to ensure this feature is utilized across servers and scientific disciplines.
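For readers who want to follow these article-preprint links programmatically, below is a minimal sketch (in Python, against the public Crossref REST API) of how a published article's preprint might be looked up. It assumes the article's Crossref record carries a "has-preprint" relation (not every record does), and the DOI shown is a placeholder rather than a real article.

```python
# Minimal sketch: look up a journal article's linked preprint via the public
# Crossref REST API. Assumes the article's record carries a "has-preprint"
# relation; records without one simply return None.
import requests

def find_preprint(article_doi):
    """Return the DOI of a linked preprint, or None if no relation is recorded."""
    resp = requests.get(f"https://api.crossref.org/works/{article_doi}", timeout=30)
    resp.raise_for_status()
    record = resp.json()["message"]
    # Crossref exposes article-preprint links in the optional "relation" field.
    for rel in record.get("relation", {}).get("has-preprint", []):
        if rel.get("id-type") == "doi":
            return rel["id"]
    return None

if __name__ == "__main__":
    doi = "10.1371/journal.pone.0000000"  # placeholder DOI for illustration
    print(find_preprint(doi) or "No preprint link recorded in Crossref")
```

Run against a linked article's DOI, this should print the DOI of the associated preprint; where no relation has been recorded, it falls back to a short message.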

Our preprint service is free of charge and automated. All you have to do is opt-in when submitting your paper. The feature includes creation of a Preprint PDF from the author’s submission files (optional), screening checks and a seamless deposit to bioRxiv to make this process as easy as possible. Authors can focus on their journal submission, knowing their results will soon be available online as they work through the review process.

Note: PLOS Medicine continues to permit authors to post preprints of their research, but given the particular issues related to research in human health, does not currently offer transfer of submitted manuscripts to bioRxiv.

Opening up to new perspectives: an interview with Aileen Fyfe

Peer Review Week 2018 is all about diversity. To look at the changing scholarly landscape, we interviewed science publishing historian Aileen Fyfe, who draws on her experience working in the archives of the Royal Society to explain how diversifying the pool of experts involved in the peer review process can further advance the scientific record.

How diverse is the peer review process at the moment?

Until more people start publishing their data on both their community of authors and reviewers, it’s difficult to tell. There are certainly concerns about the geopolitical balance: whether the global north is doing much more of the refereeing while the global south is doing a lot of the authoring and not so much of the refereeing. So that’s a large-scale problem. There are also suspicions about women’s involvement, whether women are underrepresented as reviewers and editorial board members. The stats that we have so far do suggest that that’s true. But we still don’t know very much about other forms of diversity.

One of the points that I’d like to make about diversity in peer review is that there is always going to be some limit on how diverse peer review can be, because peer review by its nature doesn’t let just anyone do it. We select as peer reviewers people who have some form of shared expertise, certain standards of training and education, and some form of research credentials, which nowadays generally means having published a couple of papers. So there’s already a limit on exactly how diverse the pool can be; there are going to be some people who are not going to be invited to do this. And for peer review to work, we wouldn’t want to lose that. But the interesting question becomes: within those limits, within the pool of people who have the appropriate level of education and professional research training, are there any other reasons why diversity should be restricted? Nowadays we would probably say no to that; we wouldn’t think that religion, gender, geographical location or socioeconomic status should have anything to do with it. But historically those things have all often been tied up with it.

Over the years we’ve extended our understanding of who can be involved in peer review, but we don’t want to lose the sense of recognising what good science looks like. But, on the other hand, there is room for a lot within that.

What happens when the peer review process isn’t diverse?

An immediate worry is that the process wouldn’t be fair if, for instance, decisions about global science were routinely being made by a handful of elite researchers in a handful of elite institutions. And we know that everyone carries implicit biases against people who are not like them, which is a particular worry in those fields of research that currently use single-blind review systems.

But you’ve also got the worry that with any group of like-minded people who share an understanding of what good science looks like, you’ve got a tendency towards intellectual conservatism. Will they recognise the innovation, the speculative approach that turns out to be really fruitful? Because if you’re looking for stuff that is familiar and solid and rigorous and good, maybe you’ll miss something that’s exciting and novel, so there’s also that aspect of diversity.

This intellectual diversity is slightly different from the socioeconomic, gender, class, and ethnic diversity that we talk about today. Nowadays we try to separate those things, and hope that we can have intellectual diversity and also have diversity in the types of people involved.

We want to try and ensure that the people doing the editorial and reviewing work aren’t too prone to groupthink, if you will, that they won’t dismiss new approaches and new ideas because they don’t look like the ones they are familiar with. So there’s a worry that the scientific content could suffer if all the people doing the reviewing come from the same institution or from the same sub-field of a discipline.

What can we do to support diversity in the peer review process and scientific discourse in general?

One of the things that does seem to work is talking about it! When I look at the change in statistics for the Royal Society’s journals over the last 30 years, it’s quite clear that something has happened in terms of who is writing for the Royal Society and reviewing for the Royal Society. That has changed a lot since the 1980s and early 1990s, and I think it’s changed much more recently than that, and you have to ask why. The only reason that I can see is that the organisation has started talking a lot about diversity, it has started gathering the statistics, and it’s started wondering what it can do better. And it looks as if that works. In terms of the editors themselves, this is still rather male-dominated, but the editorial boards, the publicly visible aspect, are getting better; and diversity among reviewers is also getting better. The proportion of women who are reviewers is now similar to the proportion of women who are authors, which seems like a reasonable place to be.

So, talking about it, asking questions, getting people to publish their stats.  

I think that raising awareness, which is becoming more common, does seem to be focusing people’s minds, even if it’s just focusing editors’ minds on who they are going to choose. And that’s the first step towards making some sort of progress, so they don’t automatically go to whatever the equivalent of an old boys’ club is right now. That could be your institutional network, the people you know and went to graduate school with, that kind of group, and ideally you want something more diverse than that. And if you get the editors thinking about that, then that’s got to be a good first step.

About Aileen Fyfe

Aileen is a professor of history at the University of St. Andrews where she researches the history of scholarly communications. She currently leads a research project based at the Royal Society investigating the history of the world’s oldest scientific journal, Philosophical Transactions.

 

Featured image: Figure 1, PONE 0197280

 

It’s Peer Review Week!!!

Peer Review Week is a global event celebrating the essential role that peer review plays in how we evaluate and communicate most scholarly research. Our aim is to spark discussion and collaboration between researchers and publishers who want to improve scholarly communication. For that, there’s no better jumping off point than this year’s topic: diversity.

How diverse is it? The scholarly landscape

While the demand for STEM degrees is on the rise around the world, we still see a relatively small percentage of published papers coming from countries in Africa and Latin America. Even more troubling are the increasingly common stories of unsafe working conditions, lack of resources, and perceptual biases keeping women out of many higher academic fields.

A study published by eLife found that the percentages of female authors, reviewers, and editors in STEM publishing fall far below 50% (37%, 28% and 26%, respectively) and predicted that they would not reach parity with male counterparts until 2042 if current growth trends continue. In some fields, the difference in representation is even more staggering. For example, a review of 435 mathematics journals published in PLOS ONE found the median percentage of editorships held by women to be only 7.6%. Fifty-one journals had no women on their editorial boards at all.

According to the eLife study, both women and men show an inclination to appoint reviewers of the same gender, limiting possibilities for women on a journal with an editorial board comprised of mostly men and perpetuating the imbalance of representation. Another study in PLOS ONE found similar results, as well as evidence of preference for other shared background characteristics such as country or institution.

If a journal’s peer review process is only as diverse as the network of editors who oversee it, we must diversify our editorial boards.

The role of publishers

Publishers, as stewards for scientific research, have a responsibility to raise all voices of the community. To do that, we must offer the tools and opportunities for researchers from all backgrounds to participate in the scholarly discussion.

The first thing we can do is actively seek an even mix of expertise, affiliation, nationality, and gender on our editorial boards and among journal staff. We also need to start stepping outside our comfort zones and ask ourselves whether our tried-and-true models really work for everyone or whether they have hidden barriers for certain groups. Some great advice for publishers and organizations is available here and here.

Starting the conversation

Peer review, and how to make it better, is the focus of many organizations; just last month a group of them penned an Open Letter to the community regarding transparent peer review. Through posted and signed reviews, we’re taking steps to further our understanding of peer review and to open up opportunities for critical conversations about peer review best practices.

Later this week, we’ll be posting an interview with Aileen Fyfe, science publishing historian, who will talk about the importance of unique voices to advance scientific literature.  The EveryONE blog also has an additional perspective, as well as a collection of papers on gender inequality in peer review.

In the meantime, we invite you to raise your own voice and share your experiences with the hashtags #PeerReviewWeek18 and #PeerRevDiversityInclusion.

Open Access Publishing Forges Ahead in Europe

A group of national funders, joined by the European Commission and the European Research Council, have announced plans to make Open Access publishing mandatory for recipients of their agencies’ research funding. Marc Schiltz, the President of Science Europe, has authored an article that outlines the path forward for their agencies. PLOS shares the coalition’s dedication to disseminating scholarly work as rapidly and widely as possible. Because of its potential impact on our communities, we’ve posted below an advance version of the article, which will soon be published simultaneously in PLOS Biology, PLOS Medicine, and Frontiers in Neuroscience.

_______________________________________________________________________

Science Without Publication Paywalls: cOAlition S for the Realisation of Full and Immediate Open Access

Marc Schiltz1*

1 President, Science Europe, Brussels, Belgium

* Marc.Schiltz@fnr.lu

In this Perspective, a group of national funders, joined by the European Commission and the European Research Council, announce plans to make Open Access publishing mandatory for recipients of their agencies’ research funding.

Open Access is Foundational to the Scientific Enterprise

Universality is a fundamental principle of science (the term ‘science’ as used here includes the humanities): only results that can be discussed, challenged, and, where appropriate, tested and reproduced by others qualify as scientific. Science, as an institution of organised criticism, can therefore only function properly if research results are made openly available to the community so that they can be submitted to the test and scrutiny of other researchers. Furthermore, new research builds on established results from previous research. The chain, whereby new scientific discoveries are built on previously established results, can only work optimally if all research results are made openly available to the scientific community.

Publication paywalls are withholding a substantial amount of research results from a large fraction of the scientific community and from society as a whole. This constitutes an absolute anomaly, which hinders the scientific enterprise in its very foundations and hampers its uptake by society. Monetising the access to new and existing research results is profoundly at odds with the ethos of science [1]. There is no longer any justification for this state of affairs to prevail and the subscription-based model of scientific publishing, including its so-called ‘hybrid’ variants, should therefore be terminated. In the 21st century, science publishers should provide a service to help researchers disseminate their results. They may be paid fair value for the services they are providing, but no science should be locked behind paywalls!

A Decisive Step Towards the Realisation of Full Open Access Needs to be Taken Now

Researchers and research funders have a collective duty of care for the science system as a whole. The 2003 Berlin Declaration [2] was a strong manifestation of the will of the science community (researchers and research funders united) to regain ownership of the rules governing the dissemination of scientific information. Science Europe established principles for the transition to Open Access in 2013 [3] but wider overall progress has been slow. In 2016, the EU Ministers of science and innovation, assembled in the Competitiveness Council, resolved that all European scientific publications should be immediately accessible by 2020.

As major public funders of research in Europe, we have a duty of care for the good functioning of the science system (of which we are part), as well as a fiduciary responsibility for the proper usage of the public funds that we are entrusted with. As university and library negotiation teams in several countries (e.g. Germany, France, Sweden) [4,5] are struggling to reach agreements with large publishing houses, we feel that a decisive move towards the realisation of Open Access and the complete elimination of publication paywalls in science should be taken now. The appointment of the Open Access Envoy by the European Commission has accelerated this process.

Hence, driven by our duty of care for the proper functioning of the science system, we have developed Plan S whereby research funders will mandate that access to research publications that are generated through research grants that they allocate, must be fully and immediately open and cannot be monetised in any way (Box 1).

 

Box 1. Plan S. Accelerating the transition to full and immediate Open Access to scientific publications.

 

The key principle is as follows:

 

“After 1 January 2020 scientific publications on the results from research funded by public grants provided by national and European research councils and funding bodies, must be published in compliant Open Access Journals or on compliant Open Access Platforms.”

 

In addition:

– Authors retain copyright of their publication with no restrictions. All publications must be published under an open license, preferably the Creative Commons Attribution Licence CC BY. In all cases, the license applied should fulfil the requirements defined by the Berlin declaration;

– The Funders will ensure jointly the establishment of robust criteria and requirements for the services that compliant high quality Open Access journals and Open Access platforms must provide;

– In case such high quality Open Access journals or platforms do not yet exist, the Funders will in a coordinated way provide incentives to establish these and support them when appropriate; support will also be provided for Open Access infrastructures where necessary;

– Where applicable, Open Access publication fees are covered by Funders or universities, not by individual researchers; it is acknowledged that all scientists should be able to publish their work Open Access even if their institutions have limited means;

– When Open Access publication fees are applied, their funding is standardised and capped (across Europe);

– Funders will ask universities, research organisations, and libraries to align their policies and strategies, notably to ensure transparency;

– The above principles shall apply to all types of scholarly publications, but it is understood that the timeline to achieve Open Access for monographs and books may be longer than 1 January 2020;

– The importance of open archives and repositories for hosting research outputs is acknowledged because of their long-term archiving function and their potential for editorial innovation;

– The ‘hybrid’ model of publishing is not compliant with the above principles;

– The Funders will monitor compliance and will sanction non-compliance.

Further Considerations

We recognise that researchers need to be given a maximum of freedom to choose the proper venue for publishing their results and that in some jurisdictions this freedom may be covered by a legal or constitutional protection. However, our collective duty of care is for the science system as a whole, and researchers must realise that they are doing a gross disservice to the institution of science if they continue to report their outcomes in publications that will be locked behind paywalls.

We also understand that researchers may be driven to do so by a misdirected reward system which puts emphasis on the wrong indicators (e.g. journal impact factor). We therefore commit to fundamentally revise the incentive and reward system of science, using the San Francisco Declaration on Research Assessment (DORA) [6] as a starting point.

The subscription-based model of scientific publishing emerged at a certain point in the history of science, when research papers needed extensive typesetting, layout design, printing, and when hardcopies of journals needed to be distributed throughout the world. While moving from print to digital, the publishing process still needs services, but the distribution channels have been completely transformed. There is no valid reason to maintain any kind of subscription-based business model for scientific publishing in the digital world, where Open Access dissemination is maximising the impact, visibility, and efficiency of the whole research process. Publishers should provide services that help scientists to review, edit, disseminate, and interlink their work and they may charge fair value for these services in a transparent way. The minimal standards for services expected from publishers are laid down on page 6 of the 2015 ‘Science Europe Principles on Open Access Publisher Services’ [3].

Obviously, our call for immediate Open Access is not compatible with any type of embargo period.

We acknowledge that ‘transformative’ type of agreements, where subscription fees are offset against publication fees, may contribute to accelerate the transition to full Open Access. Therefore, it is acceptable that, during a transition period that should be as short as possible, individual funders may continue to tolerate publications in ‘hybrid’ journals that are covered by such a ‘transformative’ type of agreement. There should be complete transparency in such agreements and their terms and conditions should be fully and publicly disclosed.

We are aware that there may be attempts to misuse the Open Access model of publishing by publishers that provide poor or non-existent editorial services (e.g. the so-called ‘predatory’ publishers). We will therefore support initiatives that establish robust quality criteria for Open Access publishing, such as the Directory of Open Access Journals (DOAJ) (https://doaj.org) and the Directory of Open Access Books (DOAB) (https://www.doabooks.org).

We note that for monographs and books the transition to Open Access may be longer than 1 January 2020, but as short as possible and respecting the targets already set by the individual research funders.

cOAlition S: Building an Alliance of Funders and Stakeholders

Plan S states the fundamental principles for future Open Access publishing. Science Europe, funders, the European Research Council and the European Commission will work together to clarify and publish implementation details. The plan does not advocate any particular Open Access business model, although it is clear that some of the current models are not compliant. We therefore invite publishers to switch to publication models that comply with these principles.

Plan S was initiated by the Open Access Envoy of the European Commission and further developed by the President of Science Europe and by a group of Heads of national funding organisations. It also drew on substantial input from the Scientific Council of the European Research Council.

Today, a group of national funders initiate the alliance cOAlition S (http://scieur.org/coalition-s) to take action towards the implementation of Plan S, and are joined by the European Commission and the European Research Council.

We invite other funding agencies and research councils, as well as stakeholders (notably researchers, universities, libraries, and publishers) to join cOAlition S and thereby contribute to the swift realisation of our vision of science without publication paywalls.

 

References

  1. Merton RK. The Normative Structure of Science. In: Merton RK. The Sociology of Science: Theoretical and Empirical Investigations. Chicago: University of Chicago Press; 1973.
  2. Max Planck Society. Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities. 22 Oct 2003. Available from: https://openaccess.mpg.de/Berlin-Declaration. Cited 29 Aug 2018.
  3. Science Europe. Science Europe Principles on Open Access to Research Publications. Apr 2013 (updated May 2015). Available from: http://scieur.org/opennew. Cited 29 Aug 2018.
  4. Kwon D. Universities in Germany and Sweden Lose Access to Elsevier Journals. The Scientist. 19 Jul 2018. Available from: https://www.the-scientist.com/news-opinion/universities-in-germany-and-sweden-lose-access-to-elsevier-journals–64522. Cited 31 Aug 2018.
  5. Kwon D. French Universities Cancel Subscriptions to Springer Journals. The Scientist. 31 Mar 2018. Available from: https://www.the-scientist.com/daily-news/french-universities-cancel-subscriptions-to-springer-journals-29882. Cited 31 Aug 2018.
  6. San Francisco Declaration on Research Assessment (DORA). DORA Roadmap: A two-year strategic plan for advancing global research assessment reform at the institutional, national, and funder level. 27 Jun 2018. Available from: https://sfdora.org/2018/06/27/dora-roadmap-a-two-year-strategic-plan-for-advancing-global-research-assessment-reform-at-the-institutional-national-and-funder-level/. Cited 29 Aug 2018.

 

 

Transparency, credit, and peer review

 

Yesterday I signed an open letter on behalf of all PLOS journals, alongside 20 other editors representing over 100 publications, to commit to offering transparent peer review options.

Support for publication of reviewer reports has been mounting as part of a greater effort to inform the discussion on peer review practice. Our joint commitment to transparent peer review comes on the heels of a meeting we attended earlier this year organized by HHMI, The Wellcome Trust and ASAPbio. Funders, editors, and publishers came together and agreed that elevating the visibility of peer review is paramount for informed scholarly discussion and early career development. Context for the initiatives is provided today in a Nature commentary.

We are excited to be working alongside so many other journals eager to bring posted reviews to our communities and to help change the way in which we talk about and understand peer review.

How it works

Our approach is to let authors and reviewers decide what level of transparency is right for them on a case by case basis. Authors will choose whether to make the peer review history public at the end of the assessment process for their manuscript. Reviewers will decide whether to reveal their identities or remain anonymous. We encourage authors and reviewers to experiment with the new options.

What’s in it for researchers?

Transparent peer review is a critical first step towards elevating peer review reports as recognized scholarly outputs.

We plan to post peer reviews with a DOI so that they can be cited in the contributor’s CV or referenced as the foundation for further discussion of the work. This is especially critical for early career researchers, who need to be able to demonstrate their varied contributions to their field.

We hope that deeper insight into peer review will strengthen understanding of the scientific record and help future generations of researchers learn about the assessment process.

Ready for change

Before making our decision, we asked our communities what they thought of transparent peer review and surveyed the feedback from other journals that have already implemented or experimented with different forms of transparent review.

In a 2017 survey of our reviewers, 87% of participants said they would be fine with posted reviews. Of the remaining 13%, many indicated that they felt this decision should be left up to the authors, a concern that we’ve taken into account by allowing authors to decide whether they want to publish their peer reviews or not (according to another survey, 45% of them do).

Other journals that offer to post anonymous reviews, including Nature Communications, eLife, and The EMBO Journal, saw little to no difference in reviewer participation rates after implementing similar policies.

We’ll share what we learn

While transparent peer review isn’t a new concept, it hasn’t yet been widely adopted. With over 23,000 research articles published each year in the PLOS family of journals, there is an opportunity for us to effect meaningful change in the way scholarly communities in all disciplines learn about and understand peer review.

We are excited to offer transparent peer review to our contributors. As we move forward, we’ll be analyzing our processes, gathering data, and listening to feedback from our contributors in order to report back to the community.

 

PLOS Update

In 2009, we launched PLOS Currents as an experimental platform for rapid communication of non-standard publications. A few communities embraced the experiment enthusiastically from the start, and the contributions of researchers who volunteered as editors and reviewers were fantastic. Over the years, we have seen important applications, for example, in small communities collaborating on rare diseases research in PLOS Currents Huntington Disease, and in rapid communication of preliminary results in the context of disease outbreaks in PLOS Currents Outbreaks. In particular, there was a surge of submissions during the 2014 Ebola outbreak and the 2015-2016 Zika virus outbreak.

However, in recent years the technology supporting this platform has aged rapidly, the user experience has been subpar, and submissions have substantially decreased. We have undertaken a thorough review to understand these concerns and to evaluate whether PLOS Currents is still meeting our original aims – and the needs of its communities. Our conclusion: it is not. Much has changed in the years since Currents’ launch and we think there are now better ways of serving the original aims. We have therefore made the difficult decision to cease its publication.

From today, PLOS Currents will no longer accept new submissions. Authors who currently have a submission under review have been contacted with details on their options going forward. All PLOS Currents content will remain available, citable, indexed in PubMed and permanently archived on the PLOS Currents site and publicly archived in PubMed Central.

In assessing how PLOS Currents measured up against its original vision, we learned three major lessons that point to a new path forward:

  1. Despite the flexibility of the format and invitations to submit wide-ranging research (e.g., negative results, single experiments, research in progress, protocols, datasets, etc.), the majority of submissions to PLOS Currents were traditional research articles.
  2. The platform underlying PLOS Currents has not evolved rapidly enough to keep pace with the needs of the community it was meant to serve.
  3. A common thread has been the desire to publish rapidly, which was particularly obvious in the case of PLOS Currents Disasters and PLOS Currents Outbreaks. However, since PLOS Currents was launched new publishing tools have emerged that can facilitate the rapid sharing of work.

We have partnered with like-minded organizations to provide better-adapted and more specialized solutions. Today we offer the option of rapid dissemination across all of our journals through our recent partnership with the preprint server bioRxiv. We have also partnered with other platforms that specialize in specific content types, such as protocols.io, an open access repository for laboratory protocols with customized features for protocol publication, execution, adaptation, and discussion. Through a set of preferred repositories that host specialist datasets, and our close collaboration with Dryad and figshare, PLOS champions the sharing of datasets according to the FAIR principles.

We continue to seek partnerships to facilitate the dissemination of research outputs that do not conform to the traditional research article mold. Meanwhile, PLOS ONE has reinforced its commitment to continue to publish negative results and replication and confirmation studies.

To bring together all these features, we have built PLOS Channels, which integrate content from all PLOS titles, the wider literature, preprint servers, blogs, and the other content platform types described above. Channel Editors curate this content to create a highly valuable community resource, developed and maintained by communities for communities. By extending beyond a single title or platform for original content, we believe that Channels are well suited to build on the initial objectives of PLOS Currents.

For example, last year, we launched a Disease Forecasting & Surveillance Channel to which one of the Editors of PLOS Currents Outbreaks and a member of our Currents review board already contribute. In May, as the Ebola outbreak in Democratic Republic of Congo worsened, we rapidly launched an Ebola Channel to serve responders. When the WHO announced the end of the outbreak, we paused activity on the Channel but stand ready to activate it, or another channel, as researchers and clinicians mobilize to fight outbreaks as they occur. These are only two examples of the potential we see for Channels to support specific communities.

The initial objectives of PLOS Currents remain vibrantly alive at PLOS and we are enormously grateful to all the PLOS Currents Editors and Reviewers, past and present, who have made this experiment possible. We will continue to work with these communities to find new ways to facilitate communication of research that fit their specific needs.
