Data management plans (DMPs) are a key element of good data management. A DMP describes the data lifecycle, detailing how research data will be handled during and after the project: which data will be collected, processed, and/or generated; what methods and standards will be applied; under which conditions the data can be used; and how the data will be curated and preserved, even after the project ends. A DMP also helps identify all project stakeholders and supports realistic calculations of the necessary resources in terms of personnel, time, expertise, and financial means.
Write a data management plan during the planning phase. It will be your most important planning document and a guide for the actual handling of data. Do this not only when your data is being produced and curated within a third-party funded project with a defined objective and time limit, but also when dealing with dynamic data, whose updating and continuation is an ongoing task. This is the case, for example, with collection data in cultural heritage institutions.
Example: Data management plan with a FAIR focus in the EU programme Horizon 2020, Annex 1
The Data Management Plan (Der Datenmanagementplan), in: Forschungsdaten.info, 2025-06-29
Web-based tool: The Research Data Management Organiser (RDMO) is available to members of institutions at many German universities. Depending on the institution, different templates for data management plans are offered that can be customised for your own project. From July 2025, RDMO is also available as a service to NFDI4Culture participants. We recommend the detailed DMP version of the DFG Checklist available there.
A collection's dataset is to be enriched with GND authority data references for the persons, subject keywords, and geographical locations it mentions. Large amounts of data can be handled efficiently with tools such as OpenRefine in separate data curation workflows; a scripted lookup is sketched below. If numerous references need to be determined during ongoing cataloguing, it is worthwhile to integrate software features that help cataloguers quickly identify and link the correct authority record in their daily work. For smaller quantities, a manual process based on human input may be the most effective, as it requires comparatively few changes elsewhere. The staff involved must have or acquire the relevant skills for data matching. Changes to the configuration of the database software or the metadata schema are necessary. The accuracy, consistency, and coverage of the entered references must be verified. Before publishing the data, ensure that the entries are transferred correctly and completely to the publication format (e.g. MARC, LIDO, EAD); data transformation scripts may need to be adjusted accordingly.
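For the batch workflow, authority record candidates can also be retrieved from the GND by script rather than through OpenRefine's interface. The following is a minimal sketch in Python, assuming the lobid-gnd search API at https://lobid.org/gnd/search and the response fields member, gndIdentifier, and preferredName as documented by lobid; verify endpoint and field names against the current API documentation before relying on them.

```python
import requests

LOBID_GND_SEARCH = "https://lobid.org/gnd/search"  # lobid-gnd search endpoint

def find_gnd_candidates(name, entity_type="Person", limit=5):
    """Query lobid-gnd for authority records matching a name.

    entity_type can be e.g. "Person", "SubjectHeading", or
    "PlaceOrGeographicName", matching the GND entity types mentioned
    above (persons, subject keywords, geographical locations).
    """
    params = {
        "q": name,
        "filter": f"type:{entity_type}",
        "size": limit,
        "format": "json",
    }
    response = requests.get(LOBID_GND_SEARCH, params=params, timeout=10)
    response.raise_for_status()
    return [
        {
            "gnd_id": record.get("gndIdentifier"),
            "label": record.get("preferredName"),
            "uri": record.get("id"),
        }
        for record in response.json().get("member", [])
    ]

if __name__ == "__main__":
    # Print candidate records for manual review by cataloguing staff.
    for candidate in find_gnd_candidates("Albrecht Dürer"):
        print(candidate["gnd_id"], candidate["label"], candidate["uri"])
```

Even with such a script, candidates should be reviewed by staff before a GND URI is written into a record, in line with the verification step described above.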
Coordination with the operators of the publication platform may be necessary to determine whether and how the new authority data references support the platform's functionalities or even enable improved Linked Data features. It does not matter if the data you provide contains references to authority data that the platform is not currently processing. Platforms continuously work to improve the quality of their services and analyse incoming data for new potential uses.
Make your data accessible through a trusted repository. Early on, assess various repositories to see if and how they support you in implementing the FAIR Principles. A certified repository provides a reliable storage location for datasets: certification guarantees that the data is securely stored and will remain available, findable, and accessible in the long term. Examples of certification standards include CoreTrustSeal, the nestor Seal for Trustworthy Digital Archives, and ISO 16363 certification. The certification level should be clearly indicated on the repository's website. If the repository is not (yet) certified, it should provide clear statements on how it ensures the availability, accessibility, and reusability of the (meta-)data over a defined period. If information is missing or the implementation of the described conditions is unclear, ask for clarification. Also, inquire about any potential costs for storage and the associated conditions.
Align your data management with the repository's terms and conditions. By adhering to the repository's standards (preferred file formats, metadata schemas, etc.), you can ensure that all requirements for providing the data are met.
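How such requirements translate into practice can be illustrated with a repository that offers a REST deposit interface. The following is a minimal sketch against Zenodo's documented deposit API; the access token, file name, and metadata values are placeholders, and the required metadata fields should be checked against the repository's current guidelines.

```python
import requests

ZENODO_API = "https://zenodo.org/api/deposit/depositions"
ACCESS_TOKEN = "YOUR_ZENODO_TOKEN"  # placeholder personal access token

# Metadata following the repository's schema: Zenodo requires at least
# a title, an upload type, a description, and a list of creators.
metadata = {
    "metadata": {
        "title": "Example collection dataset",
        "upload_type": "dataset",
        "description": "Catalogue records enriched with GND references.",
        "creators": [{"name": "Doe, Jane", "affiliation": "Example Institution"}],
    }
}

# 1. Create an empty deposition.
r = requests.post(ZENODO_API, params={"access_token": ACCESS_TOKEN}, json={})
r.raise_for_status()
deposition_id = r.json()["id"]

# 2. Upload the data file (use one of the repository's preferred formats).
with open("collection_data.csv", "rb") as fp:
    requests.post(
        f"{ZENODO_API}/{deposition_id}/files",
        params={"access_token": ACCESS_TOKEN},
        data={"name": "collection_data.csv"},
        files={"file": fp},
    ).raise_for_status()

# 3. Attach the metadata. Publishing is a separate, irreversible step
#    (POST to .../actions/publish), so review the draft first.
requests.put(
    f"{ZENODO_API}/{deposition_id}",
    params={"access_token": ACCESS_TOKEN},
    json=metadata,
).raise_for_status()
```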
In addition to Online Public Access Catalogues (OPACs), the archive information system Arcinsys, and web databases managed by individual museums or networks, specialised portals or cultural heritage platforms such as the Deutsche Digitale Bibliothek (DDB) may be considered for publishing data from libraries, archives, and museums. DDB acts as an aggregator for Europeana. Both DDB and Europeana offer advisory support for local quality management, process data, and present it not only on their platforms but also via interfaces for further reuse.
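These reuse interfaces can be queried by machines as well as browsed by people. The following is a minimal sketch against the Europeana Search API, assuming the endpoint https://api.europeana.eu/record/v2/search.json, a registered API key (placeholder here), and the documented response fields items, title, and guid.

```python
import requests

EUROPEANA_SEARCH = "https://api.europeana.eu/record/v2/search.json"
API_KEY = "YOUR_EUROPEANA_API_KEY"  # placeholder; keys are issued by Europeana

def search_europeana(query, rows=5):
    """Run a full-text search and return (titles, portal link) pairs."""
    params = {"wskey": API_KEY, "query": query, "rows": rows}
    response = requests.get(EUROPEANA_SEARCH, params=params, timeout=10)
    response.raise_for_status()
    # Each item carries a list of titles and a link to its portal page.
    return [
        (item.get("title", []), item.get("guid"))
        for item in response.json().get("items", [])
    ]

if __name__ == "__main__":
    for titles, link in search_europeana("Albrecht Dürer"):
        print("; ".join(titles), link)
```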
Crosas, Mercè: The FAIR Guiding Principles: Implementation in Dataverse, 2019
Zenodo - Principles: Sections "FAIR Principles", "Plan S - compliance self-assessment", "Strongly recommended additional criteria for repositories"