Dr Alexandra Freeman, Winton Centre for Risk & Evidence Communication, Cambridge
Octopus: a radical new approach to scientific publishing
A platform for publication of 'smaller units of publication' that will accommodate open science
I believe that almost all the major issues with the current research culture in science (including widespread questionable research & communication practices, publication bias, lack of replication, poor methodology, inequalities in access, data hoarding, slow progress, wasted resources) are driven by one thing: the current scientific publishing model.
Because the current incentive structure in science is driven almost entirely by the number of publications a researcher has (and the 'impact factor' of the commercial journals they are published in), the incentives for researchers are actually those of 'journalists', not scientists.
I believe that a radically different approach to scientific publishing is key to changing the current research culture to one where good scientific practices are recognised and can be valued. I imagine an entirely online platform ('Octopus') with the following key features:
Breaking up the old-fashioned academic 'paper'. The unit of publication will be smaller, and each will be published whenever researchers complete it: a formulated 'problem'; hypothesis; method/protocol; data; analysis; discussion/interpretation; real-world relevance/uses. Each is linked to the stage above (which may have been published by another author) to form vertical chains of research (which can of course also be horizontally linked to related publications). This will allow researchers to publish more quickly and more independently, and to get credit for their individual contributions. Publishing methods/protocols independently from results will radically reduce publication bias, improve reproducibility, and increase the amount of data published overall (as any small dataset can be published). Funders can also choose to fund well-reviewed protocols - even funding many different groups to carry out the same protocol.
Publishing statistical analyses separately from data will allow greater professionalisation of this vital aspect, as researchers can publish their data and let specialised statisticians publish an analysis. Making data publication an integral part of the chain will formalise and incentivise open data sharing (the 'data' part of the chain will be hosted in existing data repositories, with Octopus hosting a link and summary metadata). The linkage of all these publications into chains, all in one place, will make literature searching and meta-analysis quick and easy.
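As a sketch of how such chains might be represented - a hypothetical data model for illustration, not a specification of the actual platform - each mini-publication could simply carry its type and a link to the publication one stage above it:

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Optional

class Stage(Enum):
    """The smaller units of publication described above."""
    PROBLEM = auto()
    HYPOTHESIS = auto()
    METHOD = auto()          # method/protocol
    DATA = auto()            # link + summary metadata; files live in a repository
    ANALYSIS = auto()
    INTERPRETATION = auto()  # discussion/interpretation
    APPLICATION = auto()     # real-world relevance/uses

@dataclass
class Publication:
    id: str
    author_orcid: str
    stage: Stage
    title: str
    parent_id: Optional[str] = None  # the stage above, possibly by another author
    related_ids: list[str] = field(default_factory=list)  # horizontal links

def vertical_chain(pub_id: str, registry: dict[str, Publication]) -> list[Publication]:
    """Walk from a publication up its vertical chain towards the root problem."""
    chain = []
    node = registry.get(pub_id)
    while node is not None:
        chain.append(node)
        node = registry.get(node.parent_id) if node.parent_id else None
    return chain
```

Under this sketch, a meta-analyst could gather every 'data' publication hanging off a single well-reviewed protocol by filtering the registry on `stage` and `parent_id`.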
Using automatic language translation, the platform will be entirely language-agnostic. Researchers can read and write in whatever language suits them best. This will make science much more open and accessible to all, regardless of background.
Each of these (mini) publications will be publishable instantly, to be both reviewed and rated by all (a form of post-publication open peer review). This will speed up research, and solve some of the problems of peer review. At the moment, the term 'peer reviewed' gives false reassurance of quality. With fully open reviewing, everyone can see valid and constructive criticisms or praise (and from whom), sharing the community's expertise and experience.
Reviews will be treated as a publication in their own right, and can also be rated. This will recognise and value the importance of constructive collaboration.
Ratings (1-5 stars) will be on specific features. E.g. for a hypothesis: "Original and important contribution", "Clear, understandable and contains all necessary information to be verified/falsified", "Scientifically/logically valid". The choice of which features are rated is crucial - they specify what is valued in a scientific contribution. Getting these right will set the new incentive structure for researchers and drive the new culture.
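For illustration only (the criterion list is the one quoted above for a hypothesis; every other detail is my own assumption), such a scheme could be enforced as a simple validation step, with each publication type carrying its own list of rated features:

```python
from statistics import mean

# Criteria for a hypothesis, as quoted above; other types would have their own lists.
HYPOTHESIS_CRITERIA = [
    "Original and important contribution",
    "Clear, understandable and verifiable/falsifiable",
    "Scientifically/logically valid",
]

def validate_rating(scores: dict[str, int]) -> None:
    """Reject a rating unless every criterion has a whole-number 1-5 star score."""
    for criterion in HYPOTHESIS_CRITERIA:
        stars = scores.get(criterion)
        if not isinstance(stars, int) or not 1 <= stars <= 5:
            raise ValueError(f"invalid score for {criterion!r}: {stars!r}")

def criterion_averages(ratings: list[dict[str, int]]) -> dict[str, float]:
    """Per-criterion mean across all readers' ratings of one publication."""
    return {c: mean(r[c] for r in ratings) for c in HYPOTHESIS_CRITERIA}
```

Keeping the averages per criterion, rather than collapsing them into a single score, preserves exactly the signal the paragraph above argues for: what, specifically, the community values in a contribution.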
Readers will also be able to flag publications for issues such as potential misrepresentation of facts, inappropriate statistics, or potential plagiarism. This will create a self-policing community. Poor practice can be called out publicly and any genuine mistakes can be quickly corrected without anyone being misled.
Using ORCID as a log-in, all a user's contributions - publications, reviews or ratings - will be recorded. Each individual's home page on the platform will provide a detailed set of metrics about their activity. Are they a highly rated hypothesis-creator? A meticulous data-gatherer? A highly appreciated reviewer? It will also be obvious if anyone consistently overly favours their friends with ratings (like Eurovision!). This is vital. Not only will the lack of anonymous contributions minimise the chances of bad behaviour, but the personalised metrics will allow true meritocracy. These pages will be incredibly useful for institutions wanting to assess a researcher's contributions - and once these pages are used for that purpose, they will again drive researchers to maximise their performance in these metrics. So by choosing carefully what these metrics illustrate, the platform can again shape what researchers prioritise in their work: good constructive reviews, and well-rated publications.
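The 'Eurovision' check could be as simple as comparing the average score a rater gives one particular author against that rater's average across everyone. A minimal sketch, where the data shape and the idea of surfacing the gap are my own assumptions rather than platform design:

```python
from collections import defaultdict
from statistics import mean

def favouritism(ratings: list[tuple[str, str, int]]) -> dict[tuple[str, str], float]:
    """ratings: (rater_orcid, author_orcid, stars) triples.

    Returns, for each rater-author pair, how far above the rater's own
    overall average their scores for that author sit (positive = favoured).
    """
    by_rater = defaultdict(list)   # all stars each rater has given
    by_pair = defaultdict(list)    # stars each rater has given each author
    for rater, author, stars in ratings:
        by_rater[rater].append(stars)
        by_pair[(rater, author)].append(stars)
    return {
        (rater, author): mean(stars) - mean(by_rater[rater])
        for (rater, author), stars in by_pair.items()
    }
```

A pair whose gap sits well above zero across many ratings would stand out on the rater's public page, which is all the deterrence a non-anonymous system needs.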
In order to drive uptake, I propose to use AI to pull in all the old-fashioned papers already openly published and link them into the Octopus system. Making Octopus the first place to go to find research will drive uptake, and the ability for researchers to rate and review these papers will enhance that archive. It will then become the place to add new contributions.
The whole platform will be open-source and designed as a community-owned, self-sustaining project. Just like Wikipedia, developers will be able to add features and improve the code. There will be no paid editors or intermediaries - the science community will police itself (with buy-in from institutions: if serious misconduct is alleged, an individual's institution will be automatically notified).
All of this is relatively easy to achieve technically (and can incorporate many aspects already worked on by others). There are of course cultural and commercial barriers to uptake, but producing a system that works well and is of genuine and obvious benefit to the scientific community is stage one. So many people agree that this would be 'a good thing' that I believe it will, if built, find traction (and I am willing to do the ambassadorial work necessary to drive funders and institutions to recognise it - which in turn will drive researchers to use it).
Existing journals do not need to fear annihilation. Only a small proportion of what is published in traditional journals is original research. People will still need research aggregation (what is new and relevant to me today/this week?) and expert editorials (How do the latest findings in this field relate to me?). Bringing research findings to the people who need to know them was the original remit of journals, and this is what they should return to. There will still be a market for that. It is just primary research that would far better be done and published in a community-owned non-commercial platform.
For me, this concept is so important, so fundamental to science and to society - worldwide - that I cannot think of anything more important I can do with my life. I am prepared to do whatever it takes to make this work, and make this change. I am not pitching this for the award of £1000, I am pitching it in order to be able to talk about it to the conference in October, and discuss it with people whose opinions I'd value enormously.