Data Spaces Toolbox (DSSC)
Validation and Verification

Apache Syncope is an Open Source system for managing digital identities in enterprise environments, implemented in Jakarta EE technology.

Syncope is a full-fledged IAM system covering provisioning, reconciliation and reporting needs, access management and API management.

Business and organisational support tools

The Sitra Rulebook model provides a manual for establishing a data space and for setting out general terms and conditions for data sharing agreements. Rulebook Part 2 includes editable frameworks and templates, including:

  • Data Space Canvas
  • Checklists: Business, Governance, Legal, and Technical
  • Ethical maturity model
  • Rolebook
  • Servicebook
  • General Terms and Conditions (to be used as-is)
  • Template for the Constitutive Agreement
  • Template for the Accession Agreement
  • Template for the Governance Model
  • Template for the Dataset Terms of Use
Value Creation

The Data Space Builder is a suite composed of the various data space components and technical building blocks, such as catalogues, vocabulary services, trust framework and usage policies, identity management, and data exchange (including connectors and agents). It also focuses on semantic data management, data model management and NLP (Natural Language Processing) intelligence.

Validation and Verification

The Dynamic Attribute Provisioning Service (DAPS) is a high-performance solution for secure and trustworthy communication within data spaces. It is based on the industry-standard Keycloak and provides a custom Dynamic Attribute Token (DAT) Mapper Extension, which allows for flexible and efficient dynamic provisioning of attributes. The DAPS ensures secure certificate-based authentication and authorization, validating connectors while supporting compliance with data sovereignty and security requirements.

Business and organisational support tools

The Business Model Radar provides a template to visually map key components of a business model in a circular and interconnected format. It helps co-operating organisations identify mutual strengths, gaps, and opportunities by providing a holistic view of how value is created, delivered, and captured.

Data Space Registry

The Participant Registry is a core component of the iSHARE Trust Framework, enabling organisations to verify and discover trusted participants within data spaces. It acts as a decentralised registry for maintaining participant information, including their identity, roles, compliance levels, onboarding details, etc.

Value Creation

WISEPHERE is a technological environment developed by ITI that, once deployed, allows organizations to manage, share and exploit data in a reliable and secure environment, with the aim of transforming this data into knowledge and value. WISEPHERE helps companies create Data Spaces and adopt data technologies, offering a response to their technological, legal and economic uncertainties and thus facilitating the path towards the data economy.

Participant Agent

The sovity EDC Community Edition extends the Eclipse Dataspace Connector (EDC) with additional open-source enhancements, providing a ready-to-use solution for secure data exchange while ensuring data sovereignty.

Participant Agent

A modular solution that, once deployed in an organization, establishes a single point of entry for multiple data sources, either proprietary (in the role of Data Provider) or available throughout the Data Space (in the role of Data Consumer), ensuring the interoperability of shared data, trust between the parties involved in data exchange, and data sovereignty.

Vocabulary

The Smart Data Models Initiative is a collaborative effort aimed at providing open and standardized data models to facilitate interoperability and data sharing across various domains and applications, especially within the context of smart cities, digital twins, and IoT ecosystems. These data models define how data is structured and exchanged, ensuring consistency and compatibility among systems.
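As an illustration, a Smart Data Models entity is typically an NGSI-LD document; the sketch below shows the general shape only (the attribute names and context URL are placeholders, not the authoritative schema):

```python
import json

# A minimal NGSI-LD entity in the style of the Smart Data Models
# "WeatherObserved" model. Attribute names and the @context URL are
# illustrative; consult the published schema for the authoritative definition.
entity = {
    "id": "urn:ngsi-ld:WeatherObserved:example-001",
    "type": "WeatherObserved",
    "temperature": {"type": "Property", "value": 21.5, "unitCode": "CEL"},
    "dateObserved": {"type": "Property", "value": "2024-05-01T12:00:00Z"},
    "@context": ["https://example.org/smart-data-models/context.jsonld"],
}

def looks_like_ngsi_ld(e: dict) -> bool:
    """Structural sanity check only: every NGSI-LD entity carries id, type, @context."""
    return all(key in e for key in ("id", "type", "@context"))

print(looks_like_ngsi_ld(entity))          # True
print(json.dumps(entity, indent=2)[:30])   # entities serialise as plain JSON-LD
```

Because entities are plain JSON-LD, consistency checks like this can run before any exchange takes place.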

Participant Agent

The FIWARE Data Space Framework (FDF) is an integrated suite of components implementing the DSBA Technical Convergence recommendations, which every organization participating in a data space should deploy in order to “connect” to it.

Participant Agent

The TSG components allow you to participate in an IDS dataspace and exchange information with other organizations with data sovereignty in mind. You can participate with the provided components as-is, but you are also allowed to modify the components to create your own dataspace with specific use cases in mind.

Vocabulary

Systems need to use a common data model to communicate. Semantic Treehouse helps you and your community agree on, define and improve these models.

Validation and Verification

The Gaia-X registry is an open-source software with decentralisation at its core.

The Gaia-X registry is the backbone of the Gaia-X governance, and stores information, such as:

  • the list of Gaia-X Trust Anchors;
  • the result of the Trust Anchors validation processes;
  • the potential revocation of Trust Anchors’s identity;
  • the shapes and schemas for the Gaia-X Verifiable Credentials.

The Gaia-X registry is used by the Gaia-X Compliance Engine to perform the checks needed to assess Gaia-X Compliance, and it can be used by third parties to obtain correct information about the shapes, the Gaia-X Terms & Conditions, etc.

The source code of the Gaia-X registry can be reused in other public or private environments, agnostic of the Gaia-X rules.

Validation and Verification

The service takes as input the W3C Verifiable Presentations provided by the participants and checks them against shapes, expressed in the W3C SHACL format, available in the Gaia-X Registry. The service returns a W3C Verifiable Credential, the “Gaia-X Compliance Credential”, carrying a Gaia-X signature as proof that the input provided has passed all the verifications.
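The flow can be sketched as follows. The function names, DIDs and the simplified conformance check are hypothetical stand-ins; a real implementation runs full SHACL validation against the shapes published in the Gaia-X Registry and attaches a cryptographic proof:

```python
from datetime import datetime, timezone

def check_presentation(presentation: dict, shapes: dict) -> bool:
    """Stand-in for a SHACL conformance check against Registry shapes:
    here we only verify that the required credential types are present."""
    required = shapes.get("requiredTypes", [])
    types = [vc["type"] for vc in presentation.get("verifiableCredential", [])]
    return all(t in types for t in required)

def issue_compliance_credential(presentation: dict, shapes: dict) -> dict:
    """Return a Compliance Credential if all verifications pass."""
    if not check_presentation(presentation, shapes):
        raise ValueError("presentation does not conform to the published shapes")
    return {
        "type": ["VerifiableCredential", "gx:ComplianceCredential"],
        "issuer": "did:web:compliance.example.org",   # illustrative issuer
        "issuanceDate": datetime.now(timezone.utc).isoformat(),
        "credentialSubject": {"id": presentation["holder"]},
        "proof": {"type": "JsonWebSignature2020"},    # signature omitted in sketch
    }

vp = {
    "holder": "did:web:participant.example.org",
    "verifiableCredential": [{"type": "gx:LegalParticipant"}],
}
cred = issue_compliance_credential(vp, {"requiredTypes": ["gx:LegalParticipant"]})
print(cred["type"][1])  # gx:ComplianceCredential
```

The key design point survives the simplification: the compliance service never sees the underlying data, only credentials about the participant.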

Participant Agent

An IDSA-compliant, certified IDS connector.

Catalogue

The Ocean Enterprise Catalogue allows the distributed, tamper-proof, self-sovereign storage of Data, Services, and Offerings Descriptions. Metadata records are stored as signed Verifiable Credentials utilizing Ocean Enterprise smart contracts. The metadata is openly extensible to support domain-specific descriptions and standards, such as DCAT, Gaia-X, and others. The Aquarius Catalogue Cache component, based on Elasticsearch, serves as the API and enables performant queries against the distributed catalogue of any Ocean Dataspace. Aquarius continuously monitors metadata as it is created or updated and caches the catalogue state for local processing, supporting participant agents, markets and applications using the Data Space Infrastructure.

Participant Agent

As a Data Space Participant Agent, Nautilus for Ocean Enterprise provides Data Space Participants with the ability to publish, manage, discover, and consume data products and service offerings. It is a data economy toolkit and abstraction layer enabling programmatic interaction with the Ocean Enterprise Data Space Infrastructure and the components required by Participants.

Value Creation

The Ocean Enterprise Market or Ocean Enterprise Portal is a Graphical User Interface (GUI) which provides Data Space Participants with the ability to publish, manage, discover, and consume data products and service offerings. The Market allows Data Space Participants, especially Data Service Providers, to present target group specific information to potential Data Product Consumers.

Participant Agent

The Ocean Enterprise Provider, alternatively named the “Connector” or “Access Controller”, is a REST API specifically designed for the provisioning of data services. The access controller acts as an intermediary between the data source/data product provider and the user/data product consumer, thus preventing the need for the data product consumer to have direct access to the data product. Before granting access to a resource, it performs a series of checks to verify the user’s permission to access a service, such as a data product contract opt-in, the identity of the data product consumer, successful payment, and access policies. The Ocean Enterprise Provider supports integrity checks, the transfer of data, the orchestration of Compute-to-Data, and forwarding to service offerings to support “Everything as a Service”.

Toolbox
Apache Syncope
Sitra Rulebook model for a fair data economy
Data Space Builder
sovity Dynamic Attribute Provisioning Service (DAPS)
Business Model Radar
iSHARE Satellite (Participant Registry)
WISEPHERE
sovity EDC Community Edition (EDC CE)
Tekniker Dataspace Connector
Smart Data Models
FIWARE Data Space Framework (FDF)
TNO Security Gateway (TSG)
Semantic Treehouse
Gaia-X registry
Gaia-X Compliance Service
Data Space Innovation Lab Connector
Ocean Enterprise Catalogue and Aquarius Catalogue Cache
Nautilus Participant Agent
Ocean Enterprise Market
Ocean Enterprise Provider
Service definitions (DSSC)

In our building blocks, we’ve identified the capabilities needed in a data space. Design choices need to be made for each capability: which data model will we use? Which policies apply? etc. We’ve also identified standards and specifications (if available) that we recommend data spaces adopt.

Building blocks are not translated 1:1 to software. Therefore, we’ve decided to introduce the term ‘services’: technical services that exist to implement the required capabilities. Software is needed to realise these services.

Business and organisational support tools

Placeholder for tools that support the capabilities specified by the business, governance and/or legal building blocks.

Participant Agent

Participant agent services allow a participant, as the name suggests, to participate in a data space. These services provide the basic functionality required by every participant in the data space. Such services play a vital role in ensuring trust in a data space as they differentiate between a data plane and a control plane. The control plane is key here as it implements functionalities for identification, publishing of data sets, etc. Within the Participant Agent, several parts can be identified.

Participant Agent - Control Plane and Data Plane

It is important to distinguish between a control plane and a data plane:

  • The control plane is responsible for deciding how data is managed, routed and processed.
  • The data plane is responsible for the actual sharing of data.

For example, the control plane handles user identification, access, and usage policies, while the data plane handles the actual exchange of data. This implies that the control plane can be standardised to a high level, using common standards for identification, authentication, etc.

The data plane can be different for each data space and use case depending on the types of data exchange that take place. Some data spaces focus on sharing large datasets, others on message exchange, and others take an event-based approach. There is no one-size-fits-all, although some mechanisms (especially in the data interoperability pillar) can assist in making sure different data planes work together.

Participant Agent - Credential Store

The credential store is used to store credentials (identities and attestations) which have been issued by the validation and verification federation service. This could include credentials indicating that a participant is a member of a particular data space, for example.

The credential store is also used to present credentials to other participants in the data space and to validate credentials from others.

Relevant building blocks include Identity & Attestation Management and Trust Framework.

Federation

Federation services support the interplay of participants in a data space. They operate according to the policies and rules specified in the Rulebook by the data space authority.

It is important to note that data spaces are usually distributed in nature. There is not necessarily a central platform where all data is kept. In most cases, participants in a data space manage their own data and can decide for themselves whether or not to share it with other participants, sometimes even in multiple data spaces.

That being said, there can still be a need for services that facilitate this interplay: federation services. There are six main categories of federation services: Data Space Registry, Validation and Verification services, Policy Information Point services, Catalogue services, Vocabulary services and Observability services.

Data Space Registry

This service is relatively new, and we expect it to mature in the future. The Data Space Registry can be seen as a ‘config file’ of your data space: a machine-readable interpretation of parts of the Rulebook. For example, this can indicate things like:

  • Which validation and verification services are available for onboarding of new participants (and validating existing ones)
  • Which catalogues are available
  • Which vocabulary services are available
  • etc.
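Read as a ‘config file’, such a registry might be serialised as a machine-readable document along these lines (all identifiers, endpoints and field names are invented for illustration):

```python
import json

# Hypothetical machine-readable registry for a data space: a serialisable
# interpretation of parts of the Rulebook. Field names and URLs are invented.
registry = {
    "dataSpace": "urn:example:mobility-data-space",
    "validationServices": ["https://issuer.example.org/credentials"],
    "catalogues": ["https://catalogue.example.org/dcat"],
    "vocabularyServices": ["https://vocab.example.org/models"],
}

serialised = json.dumps(registry, indent=2)
restored = json.loads(serialised)
print(restored["catalogues"][0])  # https://catalogue.example.org/dcat
```

Because the registry is plain structured data, participant agents can fetch it at startup and discover the federation services they need without hard-coded endpoints.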

Catalogue

This service provides an overview of registered data products in the data space and links to their respective participant agents. This allows participants to search for and find assets in the data space. The catalogue service implements the Publication and Discovery building block. Catalogues use the DCAT specification to express the metadata of Data Products.
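A minimal DCAT description of a data product might look like the following JSON-LD sketch (the identifiers and URLs are placeholders; a real catalogue entry would carry more DCAT and Dublin Core properties):

```python
import json

# A minimal DCAT dataset description in JSON-LD. The @id and accessURL are
# placeholders; the connector endpoint is where the participant agent is reached.
dataset = {
    "@context": {"dcat": "http://www.w3.org/ns/dcat#",
                 "dct": "http://purl.org/dc/terms/"},
    "@id": "https://provider.example.org/products/air-quality",
    "@type": "dcat:Dataset",
    "dct:title": "Hourly air-quality measurements",
    "dcat:distribution": {
        "@type": "dcat:Distribution",
        "dcat:accessURL": "https://provider.example.org/connector",
        "dct:format": "application/json",
    },
}

print(dataset["@type"])                      # dcat:Dataset
print(json.dumps(dataset)[:30])              # catalogue entries are plain JSON-LD
```

The `dcat:accessURL` linking back to the provider’s participant agent is what lets a consumer move from discovery in the catalogue to negotiation with the provider.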

Validation and Verification

The building block on Identity and Attestations describes how credentials can be issued. The Trust Framework building block describes how they can be verified. The capabilities implemented by validation and verification services serve to:

  • Issue Credentials
  • Verify Credentials
  • Optionally: allow for delegation of trust (which, technically, is also the issuance of a credential).

Credentials can relate to all kinds of attestations:

  • Identity, which can be an eIDAS-compliant credential when available or another identity credential if needed.
  • Participation, which describes whether someone is a participant in the data space (i.e., has signed the relevant contracts or is compliant with certain regulations). This service implements the data space’s participants registry.
  • Other compliance: the credential indicates compliance with other rules, policies or regulations.

These services rely on the usage of W3C Verifiable Credentials and other related protocols.
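A participation credential along the lines of the W3C VC Data Model could be sketched as follows (the issuer, DIDs and the credential type are illustrative, and the signature check is omitted from the sketch):

```python
# A minimal participation credential following the W3C VC Data Model.
# Issuer, subject DIDs and the "DataSpaceParticipantCredential" type are
# illustrative; a real credential carries a cryptographic proof.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "DataSpaceParticipantCredential"],
    "issuer": "did:web:authority.example.org",
    "issuanceDate": "2024-05-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:web:participant.example.org",
        "memberOf": "urn:example:data-space",
    },
    # "proof": a signature (e.g. JsonWebSignature2020) would be attached here
}

def is_participant(vc: dict, data_space: str) -> bool:
    """Check the attestation carried by the credential (signature check omitted)."""
    return ("VerifiableCredential" in vc["type"]
            and vc["credentialSubject"].get("memberOf") == data_space)

print(is_participant(credential, "urn:example:data-space"))   # True
print(is_participant(credential, "urn:example:other-space"))  # False
```

In practice the proof is verified first, so that the attestation can be trusted before its claims are evaluated.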

Policy Information Point (PIP)

It is sometimes necessary to provide policy information, allowing participants to make informed decisions. For instance, to evaluate whether someone can be granted access or not. Or whether a credential can be issued.

Several categories of Policy Information Point Services (PIP) can be distinguished:

  1. Identity provisioning: providing information on the identity of a person or asset.
  2. Personal Consent: indicating whether a person has given consent for sharing data. This is particularly important when consent is necessary based on privacy legislation (e.g. GDPR). The personal consent service can be deployed (as is the case for all federation services) in different ways. For example, there can be a single service for personal consent in the data space, but there can also be multiple services.
  3. Conformity Assessment: assessing whether someone is conforming to a specific policy. This could be relevant, e.g. when onboarding a new participant. A set of credentials might need to be supplied to assess conformity before a participant's credential can be issued.
  4. Data Usage Policy Issuing: issuing a policy (e.g., a standardised ODRL policy) that can be used by participants of the data space (as an alternative to a policy defined by the individual participant).

Such policy information points connect to policy decision points and policy execution points in the participant agent. More information can be found in Access and Usage Policies.
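A standardised usage policy issued by such a PIP might be expressed in ODRL JSON-LD roughly as follows. The target, constraint values and the tiny evaluator are illustrative simplifications: a real Policy Decision Point implements the full ODRL semantics, not only the `eq` operator handled here.

```python
# An ODRL usage policy in JSON-LD. The uid, target and constraint values
# are placeholders for illustration.
policy = {
    "@context": "http://www.w3.org/ns/odrl.jsonld",
    "@type": "Offer",
    "uid": "https://pip.example.org/policy/research-only",
    "permission": [{
        "target": "https://provider.example.org/products/air-quality",
        "action": "use",
        "constraint": [{"leftOperand": "purpose",
                        "operator": "eq",
                        "rightOperand": "research"}],
    }],
}

def permits(policy: dict, action: str, context: dict) -> bool:
    """Toy Policy Decision Point: grant if any permission for the action has
    all of its 'eq' constraints satisfied by the request context."""
    for perm in policy.get("permission", []):
        if perm["action"] != action:
            continue
        if all(context.get(c["leftOperand"]) == c["rightOperand"]
               for c in perm.get("constraint", []) if c["operator"] == "eq"):
            return True
    return False

print(permits(policy, "use", {"purpose": "research"}))   # True
print(permits(policy, "use", {"purpose": "marketing"}))  # False
```

Keeping policies in a shared, machine-readable form like this is what allows a PIP-issued policy to be reused unchanged by many participants.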

Vocabulary

Vocabulary services provide an overview of available data models in the data space. This allows participants of the data space to choose common data models for a particular application. In the rulebook, some data models can be made mandatory to ensure semantic interoperability between participants.

Vocabulary Services can also link these data models to APIs/technical interfaces for data exchange, providing semantics and syntax.

Observability

Depending on the use case and relevant legal/contractual obligations, it might be necessary to audit data sharing within the data space. In this case, it might be required to record specific data for the purposes of provenance & traceability.

Value Creation

Value-creation services relate to additional services which reside in a data space. For value-creation services, it is not possible to define a limited list. This is because it is up to innovators in data spaces to determine which services are offered. Examples of these services could be a data marketplace, a data analytics service or a cross-data space AI service.

Participants of the data space can contract such services. They can be mandatory or optional, depending on what is specified in the Rulebook.

The Value-Creation Services building block provides more perspectives on deploying such services.

Building blocks (DSSC)

The DSSC adopted the concept of building blocks to break data spaces into manageable smaller pieces. Building blocks are basic units or components that can be implemented and combined with other building blocks to achieve the functionality of a data space. We built on the 12 building blocks described in the Design Principles for Data Spaces | Position Paper of the EU-funded OpenDEI project and extended them to 17, which can be divided into six categories, ranging from business, governance and legal to technical building blocks on data interoperability, trust, and data value. Moreover, we are not only adopting the design principles for data spaces but also continuing to improve this model based on the latest insights on data spaces.

Building blocks by category:

  • Business: Business model; Use case development; Data product; Intermediaries & operators
  • Governance: Organisational form & governance authority; Participation management
  • Legal: Regulatory compliance; Contractual framework
  • Data interoperability: Data models; Data exchange; Provenance & traceability
  • Data sovereignty & trust: Identity & attestation management; Trust framework; Access & usage policies enforcement
  • Data value creation enablers: Data, services & offering descriptions; Publication & discovery; Value creation services

Business Model

A well-defined business model is the foundation of a sustainable data space. Unlike traditional business models, a data space requires collaboration between multiple actors, balancing economic viability, governance, and trust. Combining concepts from collaborative business models, multi-sided business models and governance approaches, this section provides a structured approach to designing and evolving business models that ensure long-term success.

Use Case Development

The value of a data space stems from its use cases. Data space use cases are settings where two or more participants create business, societal or environmental value from data sharing. Use case development amplifies this value.

Use case development comprises three initial steps, with continuous improvement throughout the life cycle of the use case as the overarching process model (Figure 1):

  1. Identifying and monitoring use case scenarios is your starting point, where you generate new ideas.

  2. Refining use case scenarios is where you spend more of your time, giving detail to the use case so that you can test its viability. This includes, at the minimum, the purpose and value of the use case, the use case participants, and the necessary data flows.

  3. Implementing use cases is where you take the best ideas and move from the drawing board to putting the ideas into reality.

  4. Continuous improvement process is the overarching process throughout the life cycle of a use case where you analyze its performance, identify improvement opportunities, plan and implement changes.

Use case development is particularly important for data spaces in their early phases, as use cases attract users and participants that are essential for growth. An established data space with well-functioning use cases may opt out of use case development. However, it risks becoming obsolete as its business environment evolves and competitors develop new and improved use cases. Valuable use cases attract new customers and participants to data spaces, enabling them to scale and grow.

Data Space Offering

The data space offering consists of the set of offerings available in a data space to participants. Offerings contain data product(s), service(s), and the offering description that provides all the information needed for a potential consumer to make a decision whether to consume the data product(s) and/or the service(s) or not.

This building block provides the data space initiatives with an understanding of the offerings from a business perspective. It proposes to develop and maintain a strategy for the data space offering. The elements of a data space offering strategy are the following:

  • Identification and onboarding of priority data products and services that serve existing and future use cases

  • Development, maintenance and enforcement of the governance rules of the data space offering

  • Supporting the participants to develop and offer high-quality data products

Intermediaries and Operators

One of the core design topics for a data space is to consider how it uses service providers to provide necessary technical services as well as business and organisational services. While the data space governance authority (DSGA) and data space participants can provide these services by themselves, there are many business and governance reasons why procurement of services provides benefits. This building block elaborates on what kind of business, governance, legal, and contractual topics the DSGA should focus while procuring enabling services from intermediaries or an operator.

Within the ecosystem of service providers, intermediaries and operators form a distinct category characterised by their focus on providing enabling services. A data space can have one enabling service provider (an operator) or multiple ones (usually intermediaries). Whether to design a data space with a single operator or multiple intermediaries is a business design question, where the risks and benefits of a single provider versus multiple providers must be weighed against each other.

The effectiveness and utility of intermediaries and operators ultimately rest on the balance between four dimensions: (1) their ability to streamline trusted data sharing and make it easier and more economical, (2) their ability to improve data space accessibility and usability for different participants and so (3) contribute to scalability, and (4) their ability to enable interoperability both within and between data spaces, creating larger markets for different actors across data spaces and enabling network effects to arise. Risks associated with the procurement of services are often related to vendor lock-in, additional costs associated with vendor management, and a potential loss of sovereignty, depending on the kind of provider selected.

Developing a data space’s organisational form, its governance authority, and its governance framework and rulebook are governance design questions. Deciding on the use of intermediaries and operators is an essential part of this data space governance design. This building block provides tools for DSGAs to create better governance designs for their data spaces, with or without service providers.

Organisational Form and Governance Authority

This building block encompasses key decision-making points for the effective establishment and operation of a data space, namely the determination of an organisation's form and the establishment of a data space, the creation of a governance authority and the creation of a data space governance framework. Organisational form (or legal form) refers to the type of legal entity a data space may assume. A data space governance framework is a set of internal rules and policies applicable to all data space participants. A governance authority refers to bodies of a data space that are composed of and by data space participants responsible for developing and maintaining as well as operating and enforcing the internal rules.

The building block describes the options of creating a data space as an unincorporated (i.e. without legal personhood) or incorporated (i.e. as a legal person) entity and discusses the most important consequences of these choices in a comparative way. The choice of legal form of a data space has implications for the type and role of a governance authority, the ability of a data space to develop, implement and enforce its internal rules and, ultimately, the data space’s overall development and sustainability.

The role of a governance authority may entail various functions, such as setting internal rules and policies, ensuring compliance with internal and external rules, and resolving conflicts that may arise. A governance authority also creates mechanisms for continuous improvement of the data space, identity management, access controls and risk mitigation to build trust and quality within the data space. Overall, the governance authority maintains and operationalises the internal rules for the successful operation of the data space.

Determining the organisational form and establishing the governance authority should be completed before the data space enters the operational phase. At least in the most essential parts, a data space governance framework should be created in parallel to support the functioning of the new data space. The organisational form, type of governance authority and its role may evolve over time due to the scaling up of the data space or the assumption of new functions.

Participation Management

The Participation Management building block outlines governance processes for managing participant engagement in data spaces. This includes identifying participants, onboarding, offboarding, and setting rules for data transactions and service provision. It addresses risks like data governance challenges and reduced collaboration. This building block provides guidelines for efficient and secure participation by integrating relationships with other governance aspects such as regulatory compliance and identity management. It also provides guidance for Data Space Participants on implementing internal Data Governance that addresses Data Space specific concerns.

Regulatory Compliance

This building block aims to guide the data space governance authority in applying legal rules to a data space's design and operation. Specifically, it helps to properly define some participant roles and responsibilities, establish internal policies, and continuously monitor the regulatory compliance of a data space. In addition, it assists data space participants in understanding their rights and obligations under regulatory frameworks that are relevant to their role in a data space or to a specific data transaction. It also provides guidance on relevant legislation to those interested in setting up or joining a data space, including developers, policymakers and others.

Key elements of this building block include:

  • Triggers: Elements, criteria or events (e.g. data type, nature of participant or domain) that have occurred in a particular context of a data space and signals that a specific legal framework must or should be applied.

  • Data space requirements: Regulatory provisions that explicitly refer to data spaces.

  • Additional legal considerations: This element highlights other important legal considerations to be aware of when setting up or operating a data space, e.g. cybersecurity law. 

  • Tools enabling regulatory compliance within a data space: Technical tools or techniques designed to address certain legal requirements (such as secure processing environment, privacy-enhancing technologies etc.).

  • Regulatory Compliance Flowcharts: A step-by-step guidance helping to assess the applicability of a specific legal framework and to determine the requirements to be addressed by specific entities. The main objective of this element is to operationalise the triggers and structure the interplay of the above-mentioned elements. In the future, these flowcharts will become part of the Legal Compass which will reflect more in detail on the relationship between decisions taken in the business, technical or governance of a data space and compliance with particular legal requirements.

Contractual Framework

The Contractual Framework building block describes the legally enforceable agreements underlying the operation of a data space by different parties entering into a relationship with the data space.

There are three categories based on the subject matter of the agreement:

  • Institutional agreements

  • Data-sharing agreements

  • Services agreements

These agreements differ in terms of the parties involved, their function, and the elements covered by the agreement.

Institutional agreements implement the governance of a data space and are an essential component of the Rulebook. They not only provide the general terms and conditions for participation in a data space but also underpin its existence and provide a legal basis for its operations. Data-sharing agreements provide the legal basis for the data transactions happening in a data space among data space participants. Services agreements refer to all agreements for the provision of services to data spaces.

These identified categories offer working concepts under which various agreements are placed, such as data product contracts. The present building block provides a selection of the most important agreements. These agreements are described in terms of their functionalities, their main elements are presented, and examples are provided. The building block also lists the most common legal issues to consider regarding the Contractual Framework, pointing to existing resources in the Further Reading section to address some of these issues.

Learn more

Data Models

Data models ensure that data is accurately and consistently interpreted when exchanged within a data space. The data model consists of metadata that provides information about semantics, helping to interpret the actual exchanged data. Such models are relevant in a data space where a data provider offers data products and data consumers want to utilise and exchange these data products. When using the same data model, semantic interoperability becomes possible and data can be exchanged among the data space participants.

Data spaces should consider shared data models, or ‘semantic standards’. These models serve as dictionaries that enable data providers and data consumers to “speak the same language” when exchanging data. Considering that participants have diverse perspectives and requirements about the meaning of data, it is essential to develop, reuse, and govern these shared data models within the data space. This is a continuous balancing act between the need for strict uniformity to keep data consistent and easy to understand, and the need to accommodate the fact that different organizations have different requirements for their data. In a data space, the governance framework should include these agreements to ensure wide consensus regarding the data models used in the data space.

Data models are located in a common repository, known as a Vocabulary Service. A data product should refer to a data model stored in the Vocabulary Service. This allows both the data provider and the data consumer to refer to the repository during data exchange, ensuring accurate exchange and interpretation of the data. However, this raises a challenge for federated data spaces, as each data space develops its own unique data models. The first step in (re)using data models from other data spaces is to find and access them. Therefore, data spaces should be able to exchange their data models in a standardised manner to establish agreements on their usage.

A data model is a structured representation of data elements and relationships used to facilitate semantic interoperability within and across domains. However, there are different abstraction levels for data models. This building block distinguishes between the various types of data models and the meta-standards in which they can be expressed, while also providing examples. In addition, this building block describes how these data models can be implemented, reused, and governed, by whom, and what tools can assist in this process.
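To make the provider/consumer interaction concrete, the sketch below shows a data product referencing a shared data model held in a Vocabulary Service, so that both parties can validate the payload against the same definition. All identifiers, URLs, and field names are illustrative assumptions, not part of any normative schema.

```python
# Hypothetical sketch: a data product referencing a shared data model
# stored in a Vocabulary Service, so that provider and consumer
# interpret the exchanged fields the same way.

VOCABULARY_SERVICE = {
    "https://vocab.example.org/mobility/TrafficFlow": {
        "vehicleCount": {"type": "integer", "unit": "vehicles/hour"},
        "roadSegment": {"type": "string"},
    }
}

data_product = {
    "dataModel": "https://vocab.example.org/mobility/TrafficFlow",
    "payload": {"vehicleCount": 420, "roadSegment": "A2-exit-14"},
}

def validate(product, vocabulary):
    """Check that every payload field is defined in the referenced data model."""
    model = vocabulary[product["dataModel"]]
    return all(field in model for field in product["payload"])

print(validate(data_product, VOCABULARY_SERVICE))  # True
```

A consumer receiving the payload can run the same check against the same Vocabulary Service entry, which is what makes the shared model a common "dictionary" for both sides.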

Learn more

Data Exchange

The scope of this building block is the actual transmission of data between participants of a data space: establishing agreed-upon mechanisms between participants for that transmission. To achieve this, data spaces must make a strategic choice.

Note that ‘transmission’ can encompass many different types of data exchange (data sharing, messaging, streaming, algorithm-to-data, etc.).

The data exchange process involves a Transfer Process (TP), which progresses through a series of states. At a minimum, the participants define the basic states REQUESTED, STARTED, COMPLETED, SUSPENDED, and TERMINATED, but their final number and complexity may vary depending on the implementation. The process has to ensure that data exchange is managed systematically, with clear transitions between states based on messages exchanged between the provider and the consumer.
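The five basic states above can be sketched as a small state machine. The transition set shown here is one plausible reading (for example, allowing a suspended transfer to be resumed); the actual transitions are implementation-specific.

```python
# Minimal sketch of the Transfer Process (TP) state machine: the five
# basic states and one plausible set of legal transitions between them.

TRANSITIONS = {
    "REQUESTED": {"STARTED", "TERMINATED"},
    "STARTED": {"COMPLETED", "SUSPENDED", "TERMINATED"},
    "SUSPENDED": {"STARTED", "TERMINATED"},
    "COMPLETED": set(),   # terminal state
    "TERMINATED": set(),  # terminal state
}

class TransferProcess:
    def __init__(self):
        self.state = "REQUESTED"

    def transition(self, new_state):
        """Move to new_state only if the transition is allowed."""
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

tp = TransferProcess()
tp.transition("STARTED")
tp.transition("SUSPENDED")
tp.transition("STARTED")
tp.transition("COMPLETED")
print(tp.state)  # COMPLETED
```

In a real deployment, each transition would be triggered by a message exchanged between provider and consumer, and the engine would reject messages that do not match the current state.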

Learn more

Provenance & Traceability

Some use cases require additional metadata alongside the shared data for auditing and compliance purposes. Depending on the specific scenario, it may be necessary to track transactions occurring within the data space or identify who has accessed certain data.

The need for observability, traceability, and provenance tracking is particularly common in highly regulated industries or when managing high-value data.

It is essential to differentiate between two phases: the control phase, which involves transactions related to data-sharing contracts, and the actual data-sharing phase, where data is exchanged. Observability refers to the ability to monitor and manage data-sharing contracts, while data provenance tracking focuses on monitoring the sharing and usage of the actual data.

Both aspects fall within the scope of this framework and may be subject to regulatory or contractual compliance. Regardless, ensuring observability and provenance tracking is the responsibility of each participant and requires the implementation of robust data governance processes by all Data Space participants.

This building block offers guidance for supporting observability, provenance, traceability, logging, audits, and related processes in a standardized manner. Additionally, it addresses the collection, storage, and processing of these types of data.
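One common technique for making provenance records auditable is an append-only log in which each entry carries a hash over the previous entry, so any tampering breaks the chain. The sketch below, using only the Python standard library, illustrates the idea; the record format is an assumption, not a prescribed standard.

```python
# Tamper-evident provenance log: each entry hashes over the previous
# entry's hash, forming a simple hash chain.
import hashlib
import json

def append_entry(log, event):
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})

def verify(log):
    """Recompute the chain; any modified entry invalidates the log."""
    prev = "0" * 64
    for entry in log:
        body = {"event": entry["event"], "prev": entry["prev"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, {"actor": "consumer-42", "action": "read", "asset": "dataset-7"})
append_entry(log, {"actor": "consumer-42", "action": "read", "asset": "dataset-9"})
print(verify(log))  # True
```

Each participant could maintain such a log for its own transactions, which supports the per-participant responsibility for observability described above.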

Learn more

Identity & Attestation Management

This building block outlines the guidelines and technical mechanisms necessary for managing identities and other attestations within a data space. It focuses on enabling participants to present, verify, store, and exchange attestations in a secure, reliable, and self-sovereign manner. Identity and attestation management is foundational to onboarding participants, verifying their compliance with the Data Space Rulebook, and issuing proofs of membership that facilitate trusted data exchanges.

The primary objectives of this building block are:

  • To establish standardized methods for collecting evidence and ensuring ongoing compliance with data space rules.

  • To provide detailed examples of attestations, such as membership credentials, and explain how they can be validated within the ecosystem.

  • To detail technical standards and protocols that underpin security and data sovereignty in identity and attestation management processes, for example with regard to credential exchange.

These objectives ensure that all participants can engage in the data space with confidence in the integrity and trustworthiness of identity-related processes.
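The issue-and-verify flow for a membership attestation can be sketched as follows. Real data spaces typically use W3C Verifiable Credentials with public-key signatures; here a standard-library HMAC stands in for the issuer's signature so the example is self-contained, and the key and claim names are purely illustrative.

```python
# Illustrative sketch of issuing and verifying a membership attestation.
# HMAC is a stand-in for the issuer's cryptographic signature.
import hashlib
import hmac
import json

ISSUER_KEY = b"governance-authority-secret"  # hypothetical issuer key

def issue_membership(participant_id):
    claims = {"participant": participant_id, "credential": "DataSpaceMembership"}
    payload = json.dumps(claims, sort_keys=True).encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": signature}

def verify_membership(attestation):
    """Recompute the signature over the claims and compare in constant time."""
    payload = json.dumps(attestation["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["signature"])

att = issue_membership("participant-17")
print(verify_membership(att))  # True
```

The pattern is the same as in a credential-based system: the governance authority issues a signed proof of membership once onboarding checks pass, and any participant can later verify it without contacting the issuer.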

Learn more

Trust Framework

This building block defines the trust-specific elements of a trust framework within a data space. Trust is essential because the most critical processes - such as verifying participant identities, validating attestations, and ensuring data is managed in accordance with the data space rulebook - depend on it. A robust trust framework not only strengthens security but also accelerates trust decisions and facilitates data exchanges, thereby supporting the growth of the data space.

The objectives of this building block are:

  • To articulate the core components and principles of a trust framework tailored to data spaces.

  • To define the roles and responsibilities of entities that establish and maintain trust (e.g., trust anchors, trust service providers, trusted data sources, notaries).

  • To explain how existing trust frameworks can be utilized or extended to meet the data space’s needs.

Learn more

Access & Usage Policies Enforcement

This building block establishes core policies that govern data management in Data Spaces:

Access Control Policies

Determines who can access data and under what conditions.

  • Who can access data: Defining conditions for access based on roles, attributes, or other criteria.

  • How data access is granted and controlled: Defining policy-based frameworks for determining who receives access permissions, under what conditions access is allowed, and how authorization decisions are enforced.
    Example: A healthcare provider can access the data only if they are a registered healthcare professional and have explicit authorization from the patient.

Usage Control Policies

Specifies what actions can be performed, and which obligations apply under the policy, once access is granted.

  • What actions can (not) be performed on data: Specifying permissible operations, such as analysis, modification, sharing, or deletion.

  • How usage is controlled: Setting rules to enforce the boundaries of allowed actions, ensuring compliance with the policy.
    Example: A researcher can access patient data for analysis but cannot modify, share, or delete it without additional permissions.

Consent Management Policies

Manages consent and permissions for data usage, particularly when the data holder differs from the data subject. It determines and verifies authorized consent providers (data subjects or their representatives), establishes explicit consent processes including opt-in and opt-out mechanisms, manages consent verification and revocation workflows, and bridges relationships between data rights holders and data subjects.

  • Example: A data-sharing agreement between companies requires explicit consent from data subjects before sharing personal data.

Note: All policies (Access Control, Usage Control, and Consent Management) depend on policy engines that follow deterministic algorithms to calculate whether actions should be granted or denied. For the same input conditions, a policy engine will always produce the same output decision.
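The healthcare examples above can be expressed as one deterministic evaluation function: given the same subject attributes, action, and resource, the engine always returns the same decision. The policy structure and attribute names below are illustrative, not a normative policy language.

```python
# Deterministic policy evaluation combining the access-control and
# usage-control examples from the text. All names are illustrative.

POLICY = {
    "resource": "patient-data",
    "permit_roles": {"registered_healthcare_professional", "researcher"},
    # Researchers may analyse (read) but not modify, share, or delete.
    "permitted_actions": {
        "researcher": {"read"},
        "registered_healthcare_professional": {"read", "update"},
    },
    "requires_patient_consent": True,
}

def evaluate(subject, action, resource):
    """Return PERMIT or DENY; identical inputs always give identical output."""
    if resource != POLICY["resource"]:
        return "DENY"
    role = subject.get("role")
    if role not in POLICY["permit_roles"]:
        return "DENY"
    if POLICY["requires_patient_consent"] and not subject.get("patient_consent"):
        return "DENY"
    if action not in POLICY["permitted_actions"][role]:
        return "DENY"
    return "PERMIT"

researcher = {"role": "researcher", "patient_consent": True}
print(evaluate(researcher, "read", "patient-data"))    # PERMIT
print(evaluate(researcher, "delete", "patient-data"))  # DENY
```

Production data spaces would typically express such rules in a standardised policy language (e.g. ODRL) and delegate evaluation to a dedicated policy engine, but the deterministic input-to-decision mapping is the same.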

Learn more

Data, Services and Offerings Descriptions

A cornerstone of any data space is the precise and comprehensive description of offerings. These offering descriptions are created using machine-readable metadata, making them accessible to both humans and software systems, thus facilitating seamless interaction and automation. They encompass metadata for various elements, including data products, services, data licenses, usage terms, and additional details such as commercial terms and pricing, all systematically organised within a catalog. High-quality metadata plays a critical role in ensuring the discoverability, interoperability, and usability of data products and services, forming the foundation for an efficient data sharing ecosystem.
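A minimal machine-readable offering description might look like the record below, loosely inspired by DCAT-style metadata. The field names, values, and required-field set are illustrative assumptions, not a normative schema.

```python
# Illustrative machine-readable offering description and a minimal
# completeness check for discoverability.

offering = {
    "id": "urn:example:offering:weather-obs",
    "type": "DataProduct",
    "title": "Hourly weather observations",
    "description": "Quality-checked observations from station network X.",
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "usageTerms": {"commercialUse": True, "attributionRequired": True},
    "pricing": {"model": "subscription", "amount": 99.0, "currency": "EUR"},
}

# Hypothetical minimum set of fields for a discoverable offering.
REQUIRED = {"id", "title", "description", "license"}

def is_complete(record):
    """Check that the metadata contains the minimum discoverability fields."""
    return REQUIRED.issubset(record)

print(is_complete(offering))  # True
```

Because the record is plain structured data, both catalogue software and human readers can consume it, which is exactly the dual accessibility the building block calls for.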

Learn more

Publication and Discovery

The purpose of the Publication and Discovery building block is to provision and discover offerings within a data space. The formal descriptions of these offerings are explained in more detail in Data, Services, and Offerings Descriptions. Offerings are typically created by providers of data and services and are stored within a catalogue, where the provider is responsible for managing their lifecycle, from the moment they are published until they are removed. After publication, consumers can query the available offerings in the catalogue and find (i.e. discover) the best match.

In summary, the offerings are:

  • Created by providers to showcase their data and services;

  • Typically stored within a catalogue;

  • Used by consumers to identify offerings that best match their needs.

Based on the above, the objectives of the Publication and Discovery building block are:

  • For data and service providers:

    • Expose the metadata of data and services as offerings, ensuring they are visible to all (or a subset of) data space participants as potential consumers;

    • Manage offerings in accordance with their lifecycle;

    • Manage access to offerings;

  • For data and service consumers:

    • To identify the best-matching offerings, that is, to search (i.e., query) offerings to ascertain whether their characteristics, terms, and conditions align with their business and technical needs and requirements.

    • To request access to these offerings.

The Publication and Discovery building block is linked to Article 33 of the European Data Act ('Essential requirements regarding interoperability of data, of data sharing mechanisms and services, as well as of common European data spaces').
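The discovery step for consumers can be sketched as a query over the catalogue of published offerings, filtering on characteristics and terms. The catalogue contents and filter criteria below are illustrative.

```python
# Sketch of discovery: a consumer queries a catalogue of published
# offerings and filters on keyword and licence.

catalogue = [
    {"id": "off-1", "keywords": {"mobility", "traffic"}, "license": "CC-BY-4.0"},
    {"id": "off-2", "keywords": {"weather"}, "license": "proprietary"},
    {"id": "off-3", "keywords": {"mobility"}, "license": "CC-BY-4.0"},
]

def discover(catalogue, keyword=None, license=None):
    """Return ids of offerings matching the consumer's criteria."""
    results = []
    for offering in catalogue:
        if keyword and keyword not in offering["keywords"]:
            continue
        if license and offering["license"] != license:
            continue
        results.append(offering["id"])
    return results

print(discover(catalogue, keyword="mobility", license="CC-BY-4.0"))
# ['off-1', 'off-3']
```

The provider side of the lifecycle (publish, update, remove) would correspond to inserting, mutating, and deleting entries in the same catalogue.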

Learn more

Value Creation Services

This building block addresses the technical aspects of services that aim to create value, by different means, out of the data shared in the data space. These services, called Value Creation Services, complement the Federation Services and Participant Agent Services to compose the whole set of services available in a data space (see Services for Implementing Technical Building Blocks).

Notice that business aspects of these services are considered in the Data Space Offering building block, while information about providers of these services is provided in the Intermediaries and Operators building block.

This building block has the following objectives:

  • To define a taxonomy for Value Creation Services, based on their role and purpose within the overall data space and on their specific functionality within it. This taxonomy ensures that services effectively cover a wide range of requirements coming from other building blocks or components of the data space, data-driven applications, and initiatives, and facilitates their discovery and use by data space participants.

  • To provide an information model of a Value Creation Service, aimed at supporting the description, specification, and implementation of these services.

  • To propose the composition of atomic services as the way to create ad-hoc value in the data space, responding to the needs of use cases.

  • To provide guidelines for defining and specifying the technical infrastructure and components required to support the proper delivery of these services, and to ensure their correct management, performance, scalability, monitoring, and maintenance.

Learn more
Co-creation

Developing a data space is complex: it means balancing business, organisational, and technical aspects while ensuring sustainability. The Co-Creation Method provides a structured, step-by-step approach to help data space participants navigate these challenges, ensuring informed decision-making and efficient collaboration. It guides data space initiatives through the different stages of the development cycle, which is relevant for the evolution of the data space.

The method spans three stages (Exploratory, Preparatory, Implementation) and five development processes: Align Stakeholders on the Data Space Scope; Develop Use Cases and Identify Functional Requirements; Establish Organisational Form; Functional Analysis and Data Space Design; Establish Data Space Agreements and Policies.

Align Stakeholders on the Data Space Scope

The purpose of this process is to create a 'coalition of the willing' among different stakeholders who want to analyse the extent to which they are prepared to share data within or across data spaces. Successful data spaces require strong stakeholder alignment from the start. This process ensures all involved parties share a common understanding of the data space’s purpose, scope, and governance principles, thereby reducing misalignment and accelerating decision-making.

What is the result of the process?

  • The involved stakeholders have a basic idea of the purpose of the data space, why they might want to engage with it, and the principles with which they want to start;

  • The stakeholders will have to decide on the scope of the data space on a general level;

  • They will also decide on the context in which they will collaborate and the type of use cases that will be covered;

  • They will determine whether the data space should be for-profit or non-profit.

Who should take action regarding the result?

  • One or more parties should be appointed to orchestrate the process of setting up the data space (not just this development process but all five development processes).

  • All parties that have an initial interest in setting up the data space.

What can the actors do with the result?

  • This allows stakeholders to decide whether to continue engaging with the data space initiative. This is the first go/no-go decision.

Learn more

Develop Use Cases and Identify Functional Requirements

What is the result of the process:

  • A defined set of use cases described in detail, based on the high-level use cases (step 2 in the Align Stakeholders on the Data Space Scope process) and aligned with the data space's purpose.

  • All relevant participants are identified, as this is a prerequisite for establishing the organisational form.

These use cases will create the first value-generating activities of the data space.

Who should take action regarding the result:

  • The results are relevant for the data space participants, who will receive a clear definition for each use case on the following topics:

    • the individual business model

    • the collaborative business model

    • a business case for each participant

    • a business case for the data space as a whole

  • A list of functional requirements derived from the use cases will also be provided.

What can the actors do with the result:

  • Each actor can now decide whether to continue their involvement in the development of the data space.

  • Each actor now has clarity regarding how they wish to be involved in the data space.

  • The process orchestrator is now able to make technical, legal, and business agreements to meet the functional requirements set for each use case.

It is an absolute necessity for the participants in the data space to generate enough value with the use cases for the data space to be viable. The functional requirements offer the initial indication of the costs necessary to establish the data space and maintain its operation (which serves as input for the business case).

Learn more

Establish Organisational Form

What is the result of the process:

  • Formalisation of the collaboration by defining the organisational form.

  • The parameters of the governance framework, along with the roles and responsibilities in the data space, become clear.

Who should take action regarding the result:

  • The establishing parties that have been appointed (or have appointed themselves) to fill in and enforce the rulebook.

What can the actors do with the result:

  • Participants can start joining the data space.

  • The establishing parties have an organisational form through which further agreements can be made.

Learn more

Functional Analysis and Data Space Design

Now that most of the business and organisational decisions have been made, the data space needs to be designed and built.

What is the result of the process?

  • The functional requirements are translated into a design for the data space, detailing the necessary building blocks, standards, and services.

  • The design should include technical and organisational components, as well as agreements about governance and policies.

Who should take action regarding the result?

  • The data space authority is responsible for documenting the decisions made, such as those regarding standards, roles, responsibilities, and other specifications, in the rulebook.

What can the actors do with the result?

  • Service providers can start building and connecting their services according to the specifications of the data space.

  • Participants can start preparing their offerings and their connection to the data space.

Learn more

Establish Data Space Agreements and Policies

What is the result of the process?

  • The completion of the development processes (at least the first iteration) by documenting the policies and agreements needed to run the data space.

  • Policies, coming from both business and organisational as well as technical building blocks, will populate the rulebook. These policies are meant to govern the internal data space processes.

  • Agreements are made with third parties, such as other data spaces, providers of enabling or value-added services, or utilities such as water and electricity providers.

When exchanging or buying and selling data, agreements are also made. However, these agreements are usually between data space members (i.e. bilateral) and, therefore, outside of the scope of this development process.

Who should take action regarding the result?

  • The data space authority has its rulebook finished and filled in.

  • The data space authority has its contractual framework in place.

  • The data space participants can provide their offerings.

What can the actors do with the results?

  • All participants and stakeholders in the data space can follow and act according to the rules defined when the data space was set up.

  • The first iteration of the data space definition is finished once this process is done.

Of course, once this documentation process is finished, the work on the data space does not stop. Just like any organisation and/or business, there is a need for continuous improvement and adaptation. It is, however, beneficial to have a formal end to an iteration of the data space development processes, allowing a new iteration to start with fresh goals for improvement.

Learn more