9.3 Active project management using the data management plan

Data management plans (DMPs) are a key element of good data management. A DMP describes the data lifecycle, detailing how research data will be handled during and after the project. This includes specifying which data will be collected, processed, and/or generated, what methods and standards will be applied, the conditions under which the data can be used, and how the data will be curated and preserved, even after the project ends. It helps identify all project stakeholders and supports realistic calculations of the necessary resources in terms of personnel, time, expertise, and financial means.

Tips for working with a data management plan

Write a data management plan during the planning phase. It will be your most important planning document and a guide for the actual handling of data. Do this not only when your data is being produced and curated within a third-party funded project with a defined objective and time limit, but also when dealing with dynamic data, whose updating and continuation are ongoing tasks. This is the case, for example, with collection data in cultural heritage institutions.

  • For concrete planning, start with the data lifecycle stage of subsequent use, as many priority decisions for the other stages depend on it.
  • Begin with a rough concept and gradually refine your DMP. Design all steps so that they are realistically achievable in the daily project routine.
  • Plan carefully which individuals and institutions you need to consult or involve to achieve certain goals. These might not belong to your project team or institution. Allocate resources for these communication processes as well.
  • Plan auxiliary work packages within the project carefully, as they may be more time-consuming than initially expected. This applies, for example, to assembling collection items before digitisation or to clarifying rights.
  • Keep the DMP up-to-date throughout the project. Adapt it to the actual circumstances and challenges so that it remains a realistic basis for planning during the project's progress.
  • Even if a DMP is not required by your funding institution or supervisor, you should create one and treat it as binding for your project.
  • Make the DMP openly accessible to all project team members and any other relevant participants, and regularly remind your team to consult and update it as needed.
  • Encourage your team to raise potential difficulties with DMP implementation early. This way, solutions can be sought in time before the problem becomes more serious.
  • Use a web-based software tool to create and maintain your DMP. It helps by providing a questionnaire structure and relevant accompanying information.
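Beyond the questionnaire view, many DMP tools can also export the plan in a structured, machine-readable form, which makes it easier to track open tasks programmatically. A minimal sketch of such a structured DMP fragment in Python (the field names and project details are illustrative, not taken from a specific DMP standard or from RDMO):

```python
# Minimal sketch of a structured DMP fragment kept alongside the prose plan.
# Field names and values are illustrative, not from a specific DMP standard.
dmp = {
    "project": "Example digitisation project",  # hypothetical project name
    "last_updated": "2025-07-01",
    "datasets": [
        {
            "title": "Collection metadata",
            "formats": ["LIDO"],
            "license": "CC0-1.0",
            "preservation": "institutional repository",
            "open_tasks": ["clarify image rights", "add authority references"],
        },
    ],
}

def open_tasks(dmp):
    """Collect all pending tasks across datasets, e.g. for a team reminder."""
    return [(d["title"], t) for d in dmp["datasets"] for t in d["open_tasks"]]

print(open_tasks(dmp))
```

A structured version like this can be regenerated from a tool export after each update, so reminders to the team stay in sync with the current plan.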

Example: Data management plan with a FAIR focus in the EU programme Horizon 2020, Annex 1

Further information on data management plans

Der Datenmanagementplan, in: Forschungsdaten.info, 2025-06-29

Web-based tool: The Research Data Management Organiser (RDMO) is available to members of institutions at many German universities. Depending on the institution, there are different templates for data management plans that can be customised for your own project. From July 2025, RDMO is also available as a service to NFDI4Culture participants. We recommend the detailed DMP version of the DFG Checklist available there.

Tips for concrete project planning

  • Identify the target audiences that are most important to you.
  • Identify the object collections that are the focus of the project.
  • Identify the information that is most relevant for finding your data.
  • Take stock of:
    • all collections of the institution, as well as all databases, repositories, and systems in which the collections relevant to the project context are described and presented,
    • the existing legal frameworks, agreements, and contracts relevant to the rights management of the physical objects, their images, and metadata,
    • the current implementation status of the FAIR Principles.
  • Identify the digital objects in your data set. Determine which layers are considered ‘data’, which are ‘metadata’, and which units need to be addressed with a common identifier.
  • Based on the planned reuse contexts, decide the level of granularity at which identifiable digital objects should be addressable. In an archive, this could be a collection of hundreds of related documents. However, if they are individually catalogued and their corresponding digital copies are assigned, each could be addressed as a separate digital object.
  • Determine which metadata elements are required to evaluate your data against the criteria of the FAIR Principles. A specific checklist for the FAIRness of data, which can be integrated into a requirement catalogue for a FAIR development plan, is offered by the FAIR Data Maturity Model.
  • Not all necessary information will be contained in the project data itself. Clarify where to source the information needed to provide meaningful metadata, such as data provenance or accompanying documentation on metadata schemas or software.
  • Specify in the DMP the criteria suitable for your project to implement the FAIR Principles. Note any clarification needs and pending tasks.
  • It is likely that not all FAIRness criteria can be implemented at the same pace. Prioritise your efforts and scale them accordingly.
  • Create criteria catalogues for all stages of the data lifecycle. These should list the criteria that your data and the involved actors (individuals and infrastructures) must meet to enable various FAIR aspects. Also, document the consequences that certain actions will have in later phases of the data lifecycle. This makes interdependencies clearer, and the necessary tasks can be planned more effectively.
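A criteria catalogue of this kind can be kept in a simple structured form and checked automatically against records. A minimal sketch in Python (the stage names and criterion names are illustrative examples, not the official FAIR Data Maturity Model indicators):

```python
# Sketch of a criteria catalogue: for each lifecycle stage, the criteria a
# record must meet. Stage and criterion names are illustrative examples,
# not the official FAIR Data Maturity Model indicators.
CRITERIA = {
    "collection": ["has_identifier", "has_title"],
    "publication": ["has_identifier", "has_title", "has_license", "has_provenance"],
}

def unmet_criteria(record, stage):
    """Return the criteria of a lifecycle stage the record does not yet meet."""
    return [c for c in CRITERIA[stage] if not record.get(c)]

record = {"has_identifier": True, "has_title": True, "has_license": False}
print(unmet_criteria(record, "collection"))   # → []
print(unmet_criteria(record, "publication"))  # license and provenance missing
```

Running such checks per stage makes the interdependencies visible early: a criterion left unmet during collection (e.g. a missing identifier) shows up again as a blocker for publication.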

Example: Integration of authority data references into a dataset

A collection's data set is to be equipped with GND authority data references for the mentioned persons, subject keywords, and geographical locations. Large amounts of data can be efficiently handled using tools like OpenRefine in separate workflows as part of data curation. If numerous references need to be determined during ongoing cataloguing, it is worthwhile to integrate software features that help cataloguers quickly identify and link the correct authority record during their daily operations. For smaller quantities, a manual process based on human input may be the most effective, as it results in comparatively minimal changes elsewhere. The involved staff must have or acquire the relevant skills for data matching. Changes to the configuration of the database software or the metadata schema are necessary. The accuracy, consistency, and coverage of the entered references must be verified. Before publishing the data, ensure that the entries are correctly and completely transferred to the publication format (e.g. MARC, LIDO, EAD); data transformation scripts may need to be adjusted accordingly.
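Part of the consistency check described above can be automated before publication: stored references may be bare GND IDs, full URIs, or typing errors, and should be normalised to one form. A minimal sketch in Python (the ID pattern is a simplified approximation of GND identifier syntax, not the official specification):

```python
import re

# Sketch of a consistency check for entered GND references before publication.
# The pattern is a simplified approximation of GND identifier syntax,
# not the official specification.
GND_BASE = "https://d-nb.info/gnd/"
GND_ID = re.compile(r"^[0-9X-]{3,}$")

def to_gnd_uri(value):
    """Normalise a stored reference (bare ID or full URI) to a GND URI,
    or return None if it does not look like a GND identifier."""
    gnd_id = value.strip().removeprefix(GND_BASE)
    return GND_BASE + gnd_id if GND_ID.match(gnd_id) else None

entries = ["118540238", "https://d-nb.info/gnd/4005728-8", "not an id"]
print([to_gnd_uri(e) for e in entries])
```

Entries that come back as `None` are candidates for manual review; a real verification step would additionally check the referenced records exist, e.g. via an authority-data API.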

Coordination with the operators of the publication platform may be necessary to determine whether and how the new authority data references support the platform's functionalities or even enable improved Linked Data features. It does not matter if the data you provide contains references to authority data that the platform is not currently processing. Platforms continuously work to improve the quality of their services and analyse incoming data for new potential uses.

Choose a trustworthy data platform

Make your data accessible through a trusted repository. Early on, assess various repositories to see if and how they support you in implementing the FAIR Principles. A certified repository provides a reliable storage location for datasets. Certification guarantees that the data is securely stored and will remain available, findable, and accessible in the long term. Examples of certification standards include CoreTrustSeal, the nestor Seal for Trustworthy Digital Archives, and ISO 16363 certification. The certification level should be clearly indicated on the repository’s website. If the repository is not (yet) certified, it should provide clear statements on how it ensures the availability, accessibility, and reusability of the (meta-)data over a defined period. If information is missing or the implementation of the described conditions is unclear, ask for clarification. Also, inquire about any potential costs for storage and the associated conditions.

Align your data management with the repository's terms and conditions. By adhering to the repository's standards (preferred file formats, metadata schemas, etc.), you can ensure that all requirements for providing the data are met.
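Such a pre-submission check against the repository's stated requirements can be scripted. A minimal sketch in Python (the accepted formats and required metadata fields are hypothetical examples; consult the actual repository's terms and conditions):

```python
from pathlib import Path

# Sketch of checking a submission against a repository's stated requirements.
# The accepted formats and required metadata fields are hypothetical examples;
# consult the actual repository's terms and conditions.
ACCEPTED_SUFFIXES = {".tif", ".xml", ".pdf"}
REQUIRED_METADATA = {"title", "creator", "license", "date"}

def check_submission(files, metadata):
    """Return a list of problems found; empty if the submission conforms."""
    problems = [f"unsupported format: {f}" for f in files
                if Path(f).suffix.lower() not in ACCEPTED_SUFFIXES]
    problems += [f"missing metadata field: {m}"
                 for m in sorted(REQUIRED_METADATA - metadata.keys())]
    return problems

print(check_submission(["scan_001.tif", "notes.docx"],
                       {"title": "Example", "creator": "Example"}))
```

Running this before each deposit catches format and metadata gaps while they are still cheap to fix, rather than during the repository's ingest review.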

In addition to Online Public Access Catalogues (OPACs), the archive information system Arcinsys, and web databases managed by individual museums or networks, specialised portals or cultural heritage platforms such as the Deutsche Digitale Bibliothek (DDB) may be considered for publishing data from libraries, archives, and museums. DDB acts as an aggregator for Europeana. Both DDB and Europeana offer advisory support for local quality management, process data, and present it not only on their platforms but also via interfaces for further reuse.

Exemplary illustrations of the FAIR policy of repositories

Crosas, Mercè: The FAIR Guiding Principles: Implementation in Dataverse, 2019

Zenodo - Principles: Sections "FAIR Principles", "Plan S - compliance self-assessment", "Strongly recommended additional criteria for repositories"