Exporting data out of a research infrastructure entails retrieving data from the data curation subsystem and delivering it to an external resource. This process must be brokered by the data use and data publishing subsystems.
(Figure; notation: Notation of Computational Viewpoint Models#notation_cv_objects)
Generally, requests for data to be exported to an external resource originate from a CV Presentation Objects#virtual_laboratory. All requests are validated by the CV Service Objects#aaai_service via its authorise action interface. The laboratory provides an interface to the external resource (this might take the form of a URI and a preferred data transfer protocol) and submits a request to a data broker in the data publishing subsystem via its data request interface. The data broker translates any valid request into actions; in this scenario, a data transfer request is sent to the CV Service Objects#data_transfer_service within the data curation subsystem.
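The following is a minimal sketch (in Python) of this brokering flow under stated assumptions: the class and method names (VirtualLaboratory-style request objects, AAAIService.authorise_action, DataBroker.data_request, DataTransferService.transfer_data) are illustrative mappings of the CV objects and interfaces named above, not a prescribed API.

```python
# Illustrative sketch only: object and interface names are assumptions mapped
# from the CV objects described in the text, not an actual infrastructure API.

from dataclasses import dataclass


@dataclass
class ExportRequest:
    user: str                        # identity to be authorised by the AAAI service
    dataset_id: str                  # dataset held by the data curation subsystem
    target_uri: str                  # interface to the external resource
    transfer_protocol: str = "https" # preferred data transfer protocol


class AAAIService:
    def authorise_action(self, user: str, action: str) -> bool:
        # Placeholder policy: a real AAAI service would check authentication,
        # authorisation and accounting records for this user and action.
        return bool(user)


class DataTransferService:
    def transfer_data(self, request: ExportRequest) -> str:
        # Configures and deploys a data exporter (see the second sketch below).
        return f"transfer of {request.dataset_id} to {request.target_uri} started"


class DataBroker:
    """Data broker in the data publishing subsystem."""

    def __init__(self, aaai: AAAIService, transfer: DataTransferService):
        self.aaai = aaai
        self.transfer = transfer

    def data_request(self, request: ExportRequest) -> str:
        # Every request is validated via the AAAI service's authorise action interface.
        if not self.aaai.authorise_action(request.user, "export"):
            raise PermissionError("export not authorised")
        # Valid requests are translated into a data transfer request sent to the
        # data transfer service within the data curation subsystem.
        return self.transfer.transfer_data(request)


# A virtual laboratory would submit something along these lines:
broker = DataBroker(AAAIService(), DataTransferService())
print(broker.data_request(ExportRequest(
    user="researcher-42",
    dataset_id="doi:10.1234/example",
    target_uri="https://external.example.org/inbox",
)))
```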
The data transfer service will configure and deploy a CV Component Objects#data_exporter; this exporter retrieves data from all necessary data stores, opening a data flow from data store to external resource. The exporter is also responsible for repackaging exported datasets where necessary – this includes integrating any additional metadata or provenance information stored separately within the infrastructure that needs to accompany a dataset if it is to be used independently of the infrastructure. To do so, the exporter can invoke the CV Service Objects#catalogue_service via its export metadata interface to retrieve this additional meta-information.
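A second minimal sketch, continuing the same assumptions, illustrates the exporter side: the DataExporter, DataStore and CatalogueService.export_metadata names are hypothetical placeholders for the CV component and service objects described above, and the "package" returned stands in for the data flow to the external resource.

```python
# Illustrative sketch only: component names and the in-memory "package" are
# assumptions standing in for the CV objects and the actual data flow.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class DataStore:
    records: Dict[str, bytes] = field(default_factory=dict)

    def read(self, dataset_id: str) -> bytes:
        return self.records[dataset_id]


class CatalogueService:
    def export_metadata(self, dataset_id: str) -> Dict[str, str]:
        # Returns metadata / provenance held separately within the infrastructure.
        return {"id": dataset_id, "provenance": "curated within the infrastructure"}


@dataclass
class DataExporter:
    stores: List[DataStore]
    catalogue: CatalogueService

    def export(self, dataset_id: str, target_uri: str) -> Dict[str, object]:
        # Retrieve the dataset from whichever data store holds it.
        payload = next(s.read(dataset_id) for s in self.stores
                       if dataset_id in s.records)
        # Repackage: bundle the data with metadata and provenance so the dataset
        # can be used independently of the infrastructure.
        package = {
            "data": payload,
            "metadata": self.catalogue.export_metadata(dataset_id),
            "destination": target_uri,
        }
        # In a real deployment the package would now be streamed to the external
        # resource over the requested transfer protocol.
        return package


store = DataStore(records={"doi:10.1234/example": b"...observations..."})
exporter = DataExporter(stores=[store], catalogue=CatalogueService())
print(exporter.export("doi:10.1234/example", "https://external.example.org/inbox"))
```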