Data released on October 27, 2016
Data analysis software and canonical datasets are the driving force behind many fields of empirical science. Despite being of paramount importance, these resources are most often not adequately cited. Although some may consider this a “social” problem, its roots are technical: users of these resources are often simply unaware of the underlying computational libraries and methods they have been using in their research projects. This, in turn, fosters inefficient practices that encourage the development of new projects instead of contributions to existing, established ones. Some projects (e.g., FSL) facilitate citation of the utilized methods, but such efforts are not uniform, and the output is rarely in commonly used citation formats (e.g., BibTeX). DueCredit is a simple framework to embed information about publications or other references within the original code or dataset descriptors. References are automatically reported to the user whenever a given functionality or dataset is used.
DueCredit is currently available for Python, but we envision extending support to other frameworks (e.g., Matlab, R). Until DueCredit is adopted natively by projects, it provides the functionality to “inject” references for third-party modules.
For the developer, DueCredit implements a decorator, @due.dcite, that links a method or class to a set of references, which can be specified through a DOI or a BibTeX entry.
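A minimal sketch of this developer-facing usage is shown below. The try/except fallback mirrors the stub pattern DueCredit suggests so that annotated code still runs when the library is absent; the decorated function, its DOI, and the description are hypothetical examples:

```python
# Sketch of annotating code with DueCredit's @due.dcite decorator.
# Falls back to a no-op stub when duecredit is not installed, so the
# annotation never becomes a hard dependency.
try:
    from duecredit import due, Doi
except ImportError:
    class _InactiveDueCreditCollector:
        """No-op stand-in so decorated code runs without duecredit."""
        def dcite(self, *args, **kwargs):
            def decorator(func):
                return func  # leave the function untouched
            return decorator

    def Doi(doi):
        return doi

    due = _InactiveDueCreditCollector()

@due.dcite(Doi("10.0000/example-doi"),  # hypothetical reference
           description="Method whose use should be cited")
def cluster_data(values):
    """Toy analysis function carrying a citation annotation."""
    return sorted(values)

print(cluster_data([3, 1, 2]))
```

When DueCredit is active, calling cluster_data causes the associated reference to be collected and reported; otherwise the decorator is inert and the function behaves normally.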
The initial release of DueCredit (0.1.0) was implemented during the OHBM 2015 hackathon, uploaded to PyPI, and is freely available. DueCredit provides a concise API to associate a publication reference with any given module or function, and it ships with a simple demo script that demonstrates its utility.
DueCredit is in its early stages of development, but two days of team development at the OHBM hackathon were sufficient to establish a usable prototype implementation. Since then, the codebase has been further improved and multiple beta releases have followed, expanding the coverage of citable resources (e.g., within the scipy and sklearn modules via injections, and within PyMVPA natively).
Craddock, R. C., Bellec, P., Margules, D. S., Nichols, B. N., Pfannmöller, J. P., Badhwar, A., … Cipollini, B. (2016). 2015 Brainhack Proceedings. GigaScience, 5(S1), 1–26. doi:10.1186/s13742-016-0147-0