The primary mission of an OntoPortal installation is to host and serve ontologies and semantic artefacts. The portals accept resources in multiple knowledge representation languages: OWL, RDFS, SKOS, OBO, and UMLS-RRF. Ontologies are semantically described with rich metadata (partially extracted from the source files), and a browsing user interface with faceted search allows users to quickly identify the ontologies of interest based on their metadata.
The iSHARE Trust Framework provides a standardised approach for identification, authentication, and authorisation, enabling organisations to share data securely and efficiently. Key features include:
- Federated and Decentralised Approach: Allows parties to join data spaces through trusted onboarding procedures without the need for pre-exchanged authentication keys or participant details.
- Technical Specifications: Utilizes a REST API architecture, OAuth 2.0, OpenID Connect 1.0, PKI, and digital certificates to ensure secure interactions between participants (a token-request sketch follows this list).
- Role-Based Structure: Defines specific roles such as Service Consumer, Service Provider, Entitled Party, Identity Provider, Identity Broker, and Authorisation Registry, each with distinct responsibilities and functional requirements.
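For illustration, the token exchange behind these specifications can be sketched as a standard OAuth 2.0 client-credentials request carrying a signed JWT client assertion (RFC 7523), the pattern iSHARE builds on. The token URL, participant identifiers, and key file below are placeholders, not actual iSHARE endpoints.

```python
# Minimal sketch: OAuth 2.0 client-credentials flow with a signed JWT
# client assertion (RFC 7523), as used between iSHARE participants.
# The token URL, identifiers, and key file are placeholders.
import time
import uuid

import jwt        # PyJWT
import requests

TOKEN_URL = "https://example-service-provider.eu/connect/token"  # placeholder
CLIENT_ID = "EU.EORI.NL000000001"   # placeholder participant identifier
AUDIENCE = "EU.EORI.NL000000002"    # placeholder service provider identifier

with open("client_private_key.pem", "rb") as f:
    private_key = f.read()

now = int(time.time())
client_assertion = jwt.encode(
    {
        "iss": CLIENT_ID,
        "sub": CLIENT_ID,
        "aud": AUDIENCE,
        "jti": str(uuid.uuid4()),
        "iat": now,
        "exp": now + 30,            # short-lived assertion
    },
    private_key,
    algorithm="RS256",
)

response = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "scope": "iSHARE",
        "client_id": CLIENT_ID,
        "client_assertion_type": "urn:ietf:params:oauth:client-assertion-type:jwt-bearer",
        "client_assertion": client_assertion,
    },
    timeout=10,
)
response.raise_for_status()
access_token = response.json()["access_token"]
```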
AgroPortal is an open, community-driven vocabulary service dedicated to the agricultural and agri-food domains, designed to host, share, and interconnect data models (also known as semantic artefacts or knowledge organisation systems) such as ontologies, vocabularies, and thesauri covering multiple aspects of agricultural data: technologies, breeding, food, plant phenotypes and traits, anatomy, etc. Built on the OntoPortal technology, it provides a comprehensive environment for ontology publishing, discovery, alignment, versioning, and semantic annotation, while ensuring interoperability with external data catalogues and platforms. By serving as both a technical infrastructure and a collaborative hub, AgroPortal supports researchers, data providers, and institutions in making their data FAIR (Findable, Accessible, Interoperable, and Reusable), thus fostering knowledge integration and innovation across the agri-food sector.
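As an illustration of the portal's machine-readable access, the sketch below queries AgroPortal's OntoPortal-style REST API to list hosted ontologies and run a search. The base URL and `apikey` parameter follow the common OntoPortal/BioPortal convention and should be verified against the AgroPortal documentation.

```python
# Illustrative sketch: querying an OntoPortal-based portal (here AgroPortal)
# through its REST API. Base URL and apikey handling follow the usual
# OntoPortal/BioPortal convention; verify both against the portal docs.
import requests

BASE_URL = "https://data.agroportal.lirmm.fr"
API_KEY = "your-api-key"  # obtained from your AgroPortal account

# List hosted ontologies (metadata such as acronym and name).
ontologies = requests.get(
    f"{BASE_URL}/ontologies",
    params={"apikey": API_KEY},
    timeout=30,
).json()
print([o["acronym"] for o in ontologies][:10])

# Search over classes, e.g. for "wheat".
hits = requests.get(
    f"{BASE_URL}/search",
    params={"q": "wheat", "apikey": API_KEY},
    timeout=30,
).json()
for cls in hits.get("collection", [])[:5]:
    print(cls.get("prefLabel"), "-", cls.get("@id"))
```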
The introduction of the Predictive Unit Real-Time Information Service (PURIS) enriches a company's resilience strategy through standardized data sharing, giving stakeholders heightened transparency and comprehensive information. This clarity allows PURIS users to detect supply chain issues earlier, initiate solution-finding more swiftly, and access a wider array of options, leading to more effective, cost-efficient, and environmentally friendly outcomes. By facilitating proactive anticipation, concurrent management, and reactive recovery, PURIS supports the supply chain across pre-, during-, and post-disruption phases, thereby improving operational efficiency and resilience within the Catena-X network.
A bridging service for publishing and accessing asset metadata in the FDO (FAIR Digital Object) global data space from within an EDC- or AAS-based data space.
The Authorisation Registry is a key component of the iSHARE Trust Framework, enabling organisations to manage, verify, and delegate access rights within data-sharing ecosystems.
The Data Space Portal is a comprehensive platform that enables seamless interactions within data spaces, providing tools for data discovery and governance while ensuring interoperability and adherence to data sovereignty principles for the data space members. The Crawler module of the Data Space Portal is designed to automatically discover, index, and update data resources across members' Connectors. This component enhances the usability of data spaces by providing seamless, real-time insight into available data offers, supporting interoperability and data-sharing standards.
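A schematic sketch of the Crawler's core loop follows, assuming each member Connector exposes a Dataspace Protocol catalogue endpoint (`POST <connector>/catalog/request`); the endpoint path, message shape, and connector URLs are assumptions to be adapted to the actual Connector APIs.

```python
# Schematic sketch of the Crawler idea: periodically ask each member
# Connector for its catalogue and build a local index of data offers.
# Endpoint path and request message follow the Dataspace Protocol
# convention; adapt them to the actual Connector API.
import time

import requests

MEMBER_CONNECTORS = [
    "https://connector-a.example.org",   # placeholder members
    "https://connector-b.example.org",
]

def fetch_catalog(connector_url: str) -> dict:
    """Request the connector's catalogue as a JSON-LD document."""
    response = requests.post(
        f"{connector_url}/catalog/request",
        json={
            "@context": {"dspace": "https://w3id.org/dspace/v0.8/"},
            "@type": "dspace:CatalogRequestMessage",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

def crawl_once(index: dict) -> None:
    """Refresh the local index of datasets offered by each member."""
    for connector in MEMBER_CONNECTORS:
        try:
            catalog = fetch_catalog(connector)
        except requests.RequestException as err:
            print(f"skipping {connector}: {err}")
            continue
        index[connector] = catalog.get("dcat:dataset", [])

if __name__ == "__main__":
    offers: dict = {}
    while True:
        crawl_once(offers)
        time.sleep(600)   # re-crawl every 10 minutes
```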
Ikerlan Federated Learning Extensible KIT provides a solution designed to collaboratively improve AI models across multiple participants in a secure and privacy-preserving manner. Service providers use the KIT to publish a specific asset containing configuration files that deploy federated learning client components, which are automatically integrated with the consumer's EDC connector, enabling authorized participants to securely access the federated learning service. Clients download these components, which establish a secure, gRPC-based data plane connecting clients to the provider's aggregation services. This allows participants to train models locally and request aggregated model updates on demand.
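The KIT's own client interface is not documented here; purely as a stand-in illustration of the pattern (local training plus parameter exchange with an aggregator over gRPC), the sketch below uses the open-source Flower framework, with the aggregator address and the "training" step as placeholders.

```python
# Stand-in illustration using the Flower framework: each participant trains
# locally and exchanges model parameters with an aggregation server over
# gRPC. The server address and the "training" step are placeholders.
import flwr as fl
import numpy as np


class LocalClient(fl.client.NumPyClient):
    """Minimal client holding a single weight vector."""

    def __init__(self) -> None:
        self.weights = [np.zeros(10)]

    def get_parameters(self, config):
        return self.weights

    def fit(self, parameters, config):
        # Placeholder for real local training on private data.
        self.weights = [w + 0.1 for w in parameters]
        return self.weights, 1, {}

    def evaluate(self, parameters, config):
        # Placeholder evaluation: loss, number of samples, metrics.
        return 0.0, 1, {}


if __name__ == "__main__":
    fl.client.start_numpy_client(
        server_address="aggregator.example.org:8080",  # placeholder aggregator
        client=LocalClient(),
    )
```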
This data app focuses on enabling privacy-preserving computations in data spaces. It leverages advanced Privacy-Enhancing Technologies (PETs), currently featuring Fully Homomorphic Encryption (FHE) and planned support for approaches like anonymization techniques and Zero-Knowledge Proofs (ZKPs). It is offered in the data space and delivered as a ready-to-deploy app to be instantiated in EDC connectors. It allows participants to process and compute encrypted data, preserving data privacy and enhancing data owners’ sovereignty over their data.
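The data app's internal PET stack is not shown here; the sketch below uses the open-source TenSEAL library purely to illustrate the FHE principle the app relies on: the provider computes on ciphertexts, and only the key holder can decrypt the result.

```python
# Illustrative only: demonstrates the core idea of Fully Homomorphic
# Encryption with the TenSEAL library, i.e. computing on data while it
# stays encrypted so the provider never sees the plaintext.
import tenseal as ts

# Consumer side: create an encryption context and encrypt the input vector.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()

encrypted = ts.ckks_vector(context, [1.5, 2.0, 3.25])

# Provider side: operate directly on the ciphertext (scale and shift).
result = encrypted * 2 + [1, 1, 1]

# Back on the consumer side: only the key holder can decrypt the result.
print(result.decrypt())   # approximately [4.0, 5.0, 7.5]
```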
Apache Syncope is an Open Source system for managing digital identities in enterprise environments, implemented in Jakarta EE technology.
Syncope is a full-fledged IAM system covering provisioning, reconciliation and reporting needs, access management and API management.
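For a flavour of the API surface, the sketch below lists users through Syncope's core REST interface; the base URL and admin credentials correspond to a default local deployment and must be adapted to the actual installation.

```python
# Quick sketch: listing users through Apache Syncope's core REST API.
# Base URL and credentials match a default local deployment; adapt them
# to the actual installation.
import requests

BASE_URL = "http://localhost:9080/syncope/rest"
AUTH = ("admin", "password")   # default admin account; change in production

response = requests.get(
    f"{BASE_URL}/users",
    auth=AUTH,
    headers={"Accept": "application/json"},
    timeout=30,
)
response.raise_for_status()
page = response.json()
print("total users:", page.get("totalCount"))
for user in page.get("result", []):
    print(user.get("username"))
```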
The Sitra Rulebook model provides a manual for establishing a data space and sets out general terms and conditions for data sharing agreements. Rulebook Part 2 includes editable frameworks and templates, including:
- Data Space Canvas
- Checklists: Business, Governance, Legal, and Technical
- Ethical maturity model
- Rolebook
- Servicebook
- General Terms and Conditions (to be used as-is)
- Template for the Constitutive Agreement
- Template for the Accession Agreement
- Template for the Governance Model
- Template for the Dataset Terms of Use
The Data Space Builder is a suite composed of the various data space components and technical building blocks, such as catalogs, vocabulary services, trust framework and usage policies, identity management, and data exchange (including connectors and agents), with a particular focus on semantic data management, data model management, and NLP (Natural Language Processing) intelligence.
The Dynamic Attribute Provisioning Service (DAPS) is a high-performance solution for secure and trustworthy communication within data spaces. It is based on the industry-standard Keycloak and provides a custom Dynamic Attribute Token (DAT) Mapper Extension, which allows for flexible and efficient dynamic provisioning of attributes. The DAPS ensures secure certificate-based authentication and authorization, validating connectors while supporting compliance with data sovereignty and security requirements.
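To illustrate what a receiving connector typically does with a DAT, the sketch below validates the token against the DAPS' published signing keys and reads the provisioned attributes; the JWKS URL and claim names are placeholders for a Keycloak-based deployment and depend on the configured DAT Mapper.

```python
# Hedged sketch: validating a Dynamic Attribute Token (DAT) received from a
# peer connector against the DAPS' published signing keys, then reading the
# dynamically provisioned attributes. The JWKS URL and claim names are
# placeholders for a Keycloak-based deployment and its DAT Mapper setup.
import jwt
from jwt import PyJWKClient

JWKS_URL = "https://daps.example.org/realms/DAPS/protocol/openid-connect/certs"


def validate_dat(dat: str) -> dict:
    """Verify the DAT signature against the DAPS keys and return its claims."""
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(dat)
    return jwt.decode(
        dat,
        signing_key.key,
        algorithms=["RS256"],
        options={"verify_aud": False},  # audience handling depends on the deployment
    )


# Example use (token truncated):
#   claims = validate_dat("eyJhbGciOiJSUzI1NiIs...")
#   print(claims.get("securityProfile"), claims.get("referringConnector"))
```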
The Business Model Radar provides a template to visually map key components of a business model in a circular and interconnected format. It helps co-operating organisations identify mutual strengths, gaps, and opportunities by providing a holistic view of how value is created, delivered, and captured.
The Participant Registry is a core component of the iSHARE Trust Framework, enabling organisations to verify and discover trusted participants within data spaces. It acts as a decentralised registry for maintaining participant information, including their identity, roles, compliance levels, onboarding details, etc.
WISEPHERE is a technological environment developed by ITI that, once deployed, allows organizations to manage, share and exploit data in a reliable and secure environment, with the aim of transforming this data into knowledge and value. WISEPHERE helps companies create Data Spaces and adopt data technologies, offering a response to their technological, legal and economic uncertainties, thus facilitating the path towards the data economy.
The sovity EDC Community Edition extends the Eclipse Dataspace Connector (EDC) with additional open-source enhancements, providing a ready-to-use solution for secure data exchange while ensuring data sovereignty.
A modular solution that, once deployed in an organization, establishes a single point of entry for multiple data sources, whether proprietary (in the Data Provider role) or available throughout the Data Space (in the Data Consumer role), while ensuring the interoperability of shared data, trust between the parties involved in the data exchange, and data sovereignty.
The Smart Data Models Initiative is a collaborative effort aimed at providing open and standardized data models to facilitate interoperability and data sharing across various domains and applications, especially within the context of smart cities, digital twins, and IoT ecosystems. These data models define how data is structured and exchanged, ensuring consistency and compatibility among systems.
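As a hedged example, the sketch below creates an entity following the WeatherObserved Smart Data Model in an NGSI-LD context broker; the broker URL is a placeholder, and the exact attribute set and @context URLs should be taken from the model's repository.

```python
# Illustrative sketch: creating an entity that follows a Smart Data Model
# (here WeatherObserved) in an NGSI-LD context broker. The broker URL is a
# placeholder; attribute names and @context URLs should be checked against
# the published model (https://github.com/smart-data-models).
import requests

BROKER_URL = "http://localhost:1026"   # placeholder NGSI-LD context broker

entity = {
    "id": "urn:ngsi-ld:WeatherObserved:valencia:001",
    "type": "WeatherObserved",
    "temperature": {"type": "Property", "value": 23.5, "unitCode": "CEL"},
    "relativeHumidity": {"type": "Property", "value": 0.54},
    "dateObserved": {
        "type": "Property",
        "value": {"@type": "DateTime", "@value": "2024-05-01T12:00:00Z"},
    },
    "@context": [
        "https://raw.githubusercontent.com/smart-data-models/dataModel.Weather/master/context.jsonld",
        "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context-v1.6.jsonld",
    ],
}

response = requests.post(
    f"{BROKER_URL}/ngsi-ld/v1/entities",
    json=entity,
    headers={"Content-Type": "application/ld+json"},
    timeout=10,
)
response.raise_for_status()
```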
The FIWARE Data Space Framework (FDF) is an integrated suite of components implementing the DSBA Technical Convergence recommendations, which every organization participating in a data space should deploy in order to "connect" to it.
The TSG components allow you to participate in an IDS data space and exchange information with other organizations with data sovereignty in mind. You can participate with the provided components as-is, or modify them to create your own data space tailored to specific use cases.
Systems need to use a common data model to communicate. Semantic Treehouse helps you and your community agree on, define, and improve these models.
The Gaia-X registry is an open-source software with decentralisation at its core.
The Gaia-X registry is the backbone of the Gaia-X governance, and stores information, such as:
- the list of Gaia-X Trust Anchors;
- the results of the Trust Anchor validation processes;
- the potential revocation of Trust Anchors' identities;
- the shapes and schemas for the Gaia-X Verifiable Credentials.
The Gaia-X registry is used by the Gaia-X Compliance Engine to perform the checks needed to assess Gaia-X Compliance, and it can be used by third parties to obtain correct information about the shapes, the Gaia-X Terms & Conditions, etc.
The source code of the Gaia-X registry can be reused in other public or private environments, independently of the Gaia-X rules.
The service takes as input the W3C Verifiable Presentations provided by the participants and checks them against shapes, expressed in the W3C SHACL format, that are available in the Gaia-X Registry. The service returns a W3C Verifiable Credential, the "Gaia-X Compliance Credential", carrying a Gaia-X signature as proof that the provided input has passed all the verifications.
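The core check can be sketched with off-the-shelf tooling: validating an RDF serialisation of a Verifiable Presentation against SHACL shapes, as in the example below. File names are placeholders, and the actual service additionally checks the credentials against the Trust Anchors recorded in the Gaia-X Registry.

```python
# Minimal sketch of the shape check: validating an RDF document (e.g. a
# Verifiable Presentation serialised as JSON-LD) against SHACL shapes such
# as those published in the Gaia-X Registry. File names are placeholders.
from pyshacl import validate
from rdflib import Graph

data_graph = Graph().parse("participant_presentation.jsonld", format="json-ld")
shapes_graph = Graph().parse("gaiax_shapes.ttl", format="turtle")

conforms, report_graph, report_text = validate(
    data_graph,
    shacl_graph=shapes_graph,
    inference="rdfs",
)

print("conforms:", conforms)
if not conforms:
    print(report_text)   # human-readable constraint violations
```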
An IDSA-compliant, certified IDS connector.
The Ocean Enterprise Catalogue allows the distributed, tamper-proof, self-sovereign storage of Data, Services, and Offerings Descriptions. Metadata records are stored as signed Verifiable Credentials using Ocean Enterprise smart contracts. The metadata is openly extensible to support domain-specific descriptions and standards, such as DCAT, Gaia-X, and others. For API access and performant queries against the distributed catalogue of any Ocean Dataspace, the Aquarius Catalogue Cache Component, based on Elasticsearch, is used. Aquarius continuously monitors metadata being created or updated and caches the catalogue state for local processing, supporting participant agents, markets, and applications using the Data Space Infrastructure.
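A hedged sketch of querying the Aquarius cache follows; the path mirrors the upstream Ocean Protocol Aquarius REST API (an Elasticsearch query posted to `/api/aquarius/assets/query`), and both the deployment URL and the response handling should be checked against the Ocean Enterprise documentation.

```python
# Hedged sketch: querying the Aquarius catalogue cache with an
# Elasticsearch-style query. The deployment URL is a placeholder and the
# exact API and response shape should be verified against the docs.
import requests

AQUARIUS_URL = "https://aquarius.example-dataspace.org"   # placeholder

# Full-text search over the cached, signed metadata records.
query = {
    "query": {
        "query_string": {"query": "weather AND sensor"},
    },
    "size": 10,
}

response = requests.post(
    f"{AQUARIUS_URL}/api/aquarius/assets/query",
    json=query,
    timeout=30,
)
response.raise_for_status()
for hit in response.json().get("hits", {}).get("hits", []):
    metadata = hit["_source"].get("metadata", {})
    print(metadata.get("name"), "-", hit["_source"].get("id"))
```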
As a Data Space Participant Agent, Nautilus for Ocean Enterprise provides Data Space Participants with the ability to publish, manage, discover, and consume data products and service offerings. It is a data economy toolkit and abstraction layer enabling programmatic interactions with the Ocean Enterprise Data Space Infrastructure and the components required by Participants.
The Ocean Enterprise Market or Ocean Enterprise Portal is a Graphical User Interface (GUI) which provides Data Space Participants with the ability to publish, manage, discover, and consume data products and service offerings. The Market allows Data Space Participants, especially Data Service Providers, to present target group specific information to potential Data Product Consumers.
The Ocean Enterprise Provider, alternatively named the "Connector" or "Access Controller", is a REST API specifically designed for the provisioning of data services. The access controller acts as an intermediary between the data source/data product provider and the user/data product consumer, preventing the need for the data product consumer to have direct access to the data product. Before granting access to a resource, it performs a series of checks to verify the user's permission to access a service, such as a data product contract opt-in, the identity of the data product consumer, successful payment, and access policies. The Ocean Enterprise Provider supports integrity checks, the transfer of data, the orchestration of Compute-to-Data, and the forwarding of requests to service offerings to support "Everything as a Service".