D4.1 Draft Recommendations on Requirements for FAIR Datasets in Certified Repositories
Creators
1. University of Bremen (UniHB)
2. Digital Curation Centre (DCC)
Contributors
1. Data Archiving and Networked Services (KNAW-DANS)
2. UK Data Archive
3. Digital Curation Centre (DCC)
4. University of Bremen (UniHB)
Description
The overall goal of FAIRsFAIR is to accelerate the realization of the goals of the European Open Science Cloud (EOSC) by compiling and disseminating all knowledge, expertise, guidelines, implementations, new trajectories, training and education on FAIR matters. FAIRsFAIR work package 4 (WP4) will support the provision of practical solutions for implementing the FAIR principles through the co-development and implementation of certification schemes for trusted data repositories enabling FAIR research data in the EOSC, and the provision of organizational support and outreach activities.
One of the objectives of WP4 is to develop requirements (e.g., metrics) and tools to pilot the FAIR assessment of digital objects, in particular research data objects in trustworthy digital repositories (TDRs). This report presents the first results of the work carried out towards achieving this objective. We outline the context for our activities by summarising related work, both that performed in other FAIRsFAIR work packages and approaches to FAIR data assessment from the wider community. We then introduce a range of scenarios for assessing data objects for FAIRness before or after deposit in a repository and outline the two primary use cases on which the project will focus:
- A trustworthy data repository will offer a manual self-assessment tool to educate researchers and raise their awareness of how to make their data FAIR before depositing it in the repository, and
- A trustworthy data repository committed to FAIR data provision wants to assess its datasets programmatically for their level of FAIRness over time. To facilitate this, FAIRsFAIR will develop an automated assessment of published datasets that will be piloted with some of the repositories selected for in-depth collaboration as part of the FAIRsFAIR open calls.
In addition, we present a set of preliminary metrics corresponding to the FAIR principles that can be used to assess data objects through manual and automated testing. We discuss the development and key aspects of the metrics, including their initial alignment with the existing CoreTrustSeal requirements. This alignment forms the basis for developing the FAIR elaboration of the CoreTrustSeal requirements, one of the main ongoing activities of WP4. Furthermore, we present draft requirements that any FAIR assessment implementation will need to consider and highlight how these requirements will affect the use cases for FAIR assessment that our upcoming work will address. We conclude by outlining the next steps in our work to iteratively improve the requirements through a number of pilots. Our priorities include refining the suggested metrics, in the context of the use cases developed, based on feedback elicited during pilot testing with several communities.
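To illustrate the kind of automated, machine-actionable check envisaged in the second use case, the following Python sketch probes a single assumed metric: whether a dataset's persistent identifier (e.g., a DOI) resolves to a landing page, loosely related to FAIR principle F1. The function name, the use of the requests library, and the placeholder identifier are our assumptions for illustration only; they are not the FAIRsFAIR assessment tool or its metrics.

```python
# Minimal sketch, not the FAIRsFAIR tool: check whether a persistent
# identifier (such as a DOI) resolves to a landing page over HTTP.
import requests  # third-party HTTP library, assumed to be installed


def pid_resolves(identifier: str, timeout: float = 10.0) -> bool:
    """Return True if the identifier resolves (via redirects) to a reachable page."""
    url = identifier if identifier.startswith("http") else f"https://doi.org/{identifier}"
    try:
        response = requests.head(url, allow_redirects=True, timeout=timeout)
        return response.status_code == 200
    except requests.RequestException:
        return False


if __name__ == "__main__":
    # "10.1234/example-doi" is a hypothetical placeholder, not a real dataset identifier.
    print(pid_resolves("10.1234/example-doi"))
```

A repository could run checks like this periodically over its published datasets to monitor their level of FAIRness over time, aggregating the results per metric.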
Files
D4.1_Draft_Recommendations_on_Requirements_for_FAIR_Datasets_in_Certified_Repositories_v1.0.pdf (1.4 MB, md5:9ef39e0938c854501ed8ffdbc536eda2)