AI Summary • Published on Apr 20, 2026
Testing of cyber‑physical energy systems currently relies on informal, narrative documentation and heterogeneous data formats, which hampers reproducibility, traceability, and FAIR‑compliant data sharing. Without machine‑readable semantics for test objectives, system configurations, and workflow provenance, reproducing experiments across laboratories is error‑prone and labor‑intensive.
The authors propose an ontology‑driven dataspace that combines three complementary viewpoints: (1) a Holistic Test Description ontology (HTD‑O) for experimental goals and criteria, (2) a System Configuration Model ontology (SCM‑O) that represents multi‑domain system configurations as graphs, and (3) the Open Provenance Model for Workflows (OPMW) to capture workflow templates and execution traces. Two cross‑laboratory case studies (a physical PV‑inverter test and a digital‑twin workflow) demonstrate how metadata can be extracted, encoded in RDF/OWL, and stored in a shared triplestore.
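As a rough illustration of the three viewpoints, test metadata can be thought of as subject–predicate–object triples in a shared graph. The sketch below uses plain Python tuples in place of a real RDF library and triplestore; every namespace, class, and property name (other than the general OPMW namespace) is a hypothetical placeholder, not the paper's actual vocabulary.

```python
# Minimal sketch: encoding test metadata from the three viewpoints as
# RDF-style triples, using plain Python tuples instead of a triplestore.
# All "ex:" identifiers and htd:/scm: terms are hypothetical placeholders.

HTD = "http://example.org/htd-o#"    # hypothetical HTD-O namespace
SCM = "http://example.org/scm-o#"    # hypothetical SCM-O namespace
OPMW = "http://www.opmw.org/ontology/"

triples = {
    # HTD-O viewpoint: test objective and criteria
    ("ex:PVInverterTest", "rdf:type", HTD + "TestCase"),
    ("ex:PVInverterTest", HTD + "hasObjective", "Verify inverter set-point tracking"),
    # SCM-O viewpoint: multi-domain system configuration as a graph
    ("ex:LabASetup", "rdf:type", SCM + "SystemConfiguration"),
    ("ex:LabASetup", SCM + "hasComponent", "ex:Inverter1"),
    # OPMW viewpoint: an execution trace tied to a workflow template
    ("ex:Run1", "rdf:type", OPMW + "WorkflowExecutionAccount"),
    ("ex:Run1", OPMW + "correspondsToTemplate", "ex:PVTestTemplate"),
}

def objects(subject, predicate):
    """Return all objects of triples matching (subject, predicate, ?o)."""
    return sorted(o for s, p, o in triples if s == subject and p == predicate)

print(objects("ex:LabASetup", SCM + "hasComponent"))
```

In a real deployment these triples would be serialized in RDF/OWL and stored in the shared triplestore, where the same pattern-matching lookup becomes a SPARQL query.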
The case studies show that the proposed ontologies can encode test specifications, device configurations, and provenance information in sufficient detail to reproduce the original experiments. Metadata files were generated for the OpenSVP dataset, and provenance graphs linked measurements to test steps. Replicating voltage and power set‑point sequences across two labs exposed gaps in the existing documentation but confirmed that the ontology framework enables systematic annotation and discovery.
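The provenance linkage described above can be pictured as a graph traversal from a measurement back to the test step that produced it. The sketch below is again a toy in-memory stand-in for a provenance graph; the property names (`prov:wasGeneratedBy`, `ex:implementsTestStep`, `ex:performedAtLab`) are illustrative placeholders, not terms confirmed by the paper.

```python
# Sketch of a reproducibility-oriented lookup over a provenance graph:
# follow hypothetical links from a measurement back to the test step and
# lab that produced it. Property names are placeholders.

provenance = {
    ("ex:Measurement42", "prov:wasGeneratedBy", "ex:StepExec3"),
    ("ex:StepExec3", "ex:implementsTestStep", "ex:VoltageStep"),
    ("ex:StepExec3", "ex:performedAtLab", "ex:LabA"),
}

def follow(subject, predicate):
    """Return the first object reachable via (subject, predicate, ?o), or None."""
    return next((o for s, p, o in provenance if s == subject and p == predicate), None)

step_exec = follow("ex:Measurement42", "prov:wasGeneratedBy")
test_step = follow(step_exec, "ex:implementsTestStep")
lab = follow(step_exec, "ex:performedAtLab")
print(test_step, lab)
```

Automated reproducibility checks of the kind envisioned in the paper would amount to running such traversals over the shared triplestore and comparing the recovered test steps and configurations across laboratories.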
By providing a structured, machine‑actionable representation of test metadata, the approach supports FAIR principles, facilitates AI‑ready data annotation, and paves the way for automated reproducibility checks and cross‑institutional collaboration. Future work will extend the ontologies, integrate toolchains for automated annotation, and validate the framework in larger federated testing environments.