
# Redaptor

Redaptor is a platform consisting of (1) a database of reusable methods, data, and tools extracted from publications and (2) an evaluation system that streamlines and incentivizes rigorous scientific reporting practices.

Evaluating scientific impact through citation counts and journal prestige does not motivate researchers to invest in openness and thoroughness. Scientists agree that research should be collegial, transparent, and dynamic so that others can build on it, yet hiring institutions and funding agencies consistently reward novelty and impact over rigorous scientific reporting, partly because they lack tools to quantitatively assess this vital aspect of the research enterprise.

Numerous platforms already facilitate open access to methods, tools, data, and metadata. However, identifying reusable resources in the growing body of research literature, understanding their provenance, and combining them to extend findings is not trivial. This in turn makes scoring and rewarding reusable contributions difficult.

The central objective of the Redaptor platform is to automatically acquire reusable elements, such as methods, tools, data, and metadata, directly from published text and house them in a searchable, centralized database. As our first core functionality, we will replace time- and resource-intensive manual curation with a combination of automated natural language processing and crowd-sourced curation (a toy extraction sketch appears below).

The second core functionality, the reuse-index, will dynamically characterize the influence of an article by ranking its associated reusable elements, and it will improve as authors update tools, methods, and datasets on our platform (see the scoring sketch below). By providing a mechanism for viewing, assessing, and updating publications specifically with their reuse in mind, the platform creates a positive feedback loop following the "living paper" model of publishing. This approach additionally incentivizes researchers to post evidence of replication, negative results, and extensions of published work.

The evaluation of these efforts through the reuse-index will empower funding bodies and mobilize the research community, including research institutions and journals, to enforce reporting standards. As an outcome of this project we expect to improve the quality of scientific reporting while decreasing the costs associated with a lack of transparency.
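
To make the first core functionality concrete, here is a minimal sketch of what cue-phrase extraction of reusable elements from article text could look like. The element categories, the regular-expression patterns, and the `ReusableElement` structure are all illustrative assumptions; the actual pipeline would combine trained NLP models with crowd-sourced curation and is not specified here.

```python
# Toy extractor for reusable-element mentions in publication text.
# Patterns and categories are illustrative assumptions, not the real pipeline.
import re
from dataclasses import dataclass


@dataclass
class ReusableElement:
    category: str   # "tool", "data", or "method"
    mention: str    # the text span matched in the article
    context: str    # surrounding sentence, kept for provenance

# Naive cue-phrase patterns, one per element category (assumptions).
PATTERNS = {
    "tool": re.compile(r"(?:using|with) the ([A-Z][\w-]+) (?:software|package|toolkit)"),
    "data": re.compile(r"(?:deposited in|available at|downloaded from) ([A-Z][\w-]+)"),
    "method": re.compile(r"(?:following|adapted from) the ([\w\s-]+?) protocol"),
}


def extract_elements(text: str) -> list[ReusableElement]:
    """Scan article text and return candidate reusable elements."""
    elements = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for category, pattern in PATTERNS.items():
            for match in pattern.finditer(sentence):
                elements.append(ReusableElement(category, match.group(1), sentence))
    return elements


if __name__ == "__main__":
    sample = ("Reads were aligned using the Bowtie2 software. "
              "Raw sequences were deposited in GEO.")
    for el in extract_elements(sample):
        print(f"[{el.category}] {el.mention}")
```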
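
Similarly, the reuse-index is only described conceptually above. The following toy computes one possible version as a weighted sum of reuse events recorded against an article's extracted elements; the event types and weights in `EVENT_WEIGHTS` are placeholders, not the project's actual scoring scheme.

```python
# Minimal sketch of a reuse-index score; weights and event types are
# placeholder assumptions, not a specification from the project.
from dataclasses import dataclass, field

# Hypothetical weights: downstream reuse counts more than passive views.
EVENT_WEIGHTS = {"view": 0.1, "download": 0.5, "reuse_in_publication": 2.0,
                 "replication": 3.0, "extension": 3.0, "author_update": 1.0}


@dataclass
class Element:
    name: str
    events: list[str] = field(default_factory=list)  # recorded reuse events


def element_score(element: Element) -> float:
    """Weighted sum of the reuse events recorded for one element."""
    return sum(EVENT_WEIGHTS.get(e, 0.0) for e in element.events)


def reuse_index(article_elements: list[Element]) -> float:
    """Rank an article by the aggregate reuse of its extracted elements."""
    return sum(element_score(el) for el in article_elements)


if __name__ == "__main__":
    elements = [
        Element("Bowtie2 alignment step", ["download", "reuse_in_publication"]),
        Element("GEO dataset (hypothetical)", ["view", "replication"]),
    ]
    print(f"reuse-index: {reuse_index(elements):.1f}")  # 2.5 + 3.1 = 5.6
```

A simple additive score like this keeps the index transparent and easy to update as new reuse events arrive, which is what allows it to improve dynamically as authors maintain their tools, methods, and datasets on the platform.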