Data source as ADAF
Import a datasource as an ADAF.
For instructions on how to add or write custom plugins, see Plugins.
If the input datasource is a URL resource, the node downloads it to a temporary file before importing it.
If the URL resource contains credential variables, these are entered as part of the URL.
See Credentials Preferences for more info.
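The download step described above can be sketched as follows. This is a minimal, hypothetical illustration, not the node's actual implementation: the helper names and the use of `string.Template` for credential variables are assumptions for the sake of the example.

```python
import tempfile
import urllib.request
from string import Template


def resolve_credentials(url, credentials):
    # Hypothetical helper: substitute credential variables such as
    # $username and $password into the URL before downloading.
    # (The real node resolves these via the Credentials Preferences.)
    return Template(url).safe_substitute(credentials)


def download_to_tempfile(url):
    # Download the URL resource to a named temporary file and return
    # its path, mirroring how the node imports URL datasources.
    response = urllib.request.urlopen(url)
    with tempfile.NamedTemporaryFile(delete=False, suffix='.dat') as fp:
        fp.write(response.read())
        return fp.name
```

The temporary file is then handed to the selected importer plugin exactly as a local file would be.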
- Importer (active_importer)
Select data format importer
- Action on import failure (fail_strategy)
Decide how failure to import a file should be handled.
ADAF is an internal data type in Sympathy for Data. An ADAF stores different kinds of data connected to a simultaneous event: metadata (data about data), results (aggregated or calculated data), and timeseries (accumulated time-resolved data), together with defined connections between them.
The different kinds of data are separated into containers. For the metadata and the results, the containers consist of a set of signals stored as a Table.
For the time-resolved data, the container has a more advanced structure. The time-resolved data from a measurement may have been collected from different measurement systems, and the data can, for various reasons, not be stored together: for example, the two systems may not use the same sample rate or may lack a common absolute zero time. The timeseries container in the ADAF can therefore include one or many system containers. Even within a single measurement system, data can have been measured at different sample rates, so a system container can consist of one or many rasters. Each raster consists of a time base and a set of corresponding signals, which are all stored as the internal Table type.
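The container hierarchy described above can be modelled with a few plain Python classes. This is a simplified sketch of the layout only, not the real ADAF API in Sympathy for Data; the class and field names are assumptions for illustration.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Raster:
    # One raster: a shared time base plus the signals sampled on it.
    basis: List[float]
    signals: Dict[str, List[float]] = field(default_factory=dict)


@dataclass
class ADAF:
    # Hypothetical, simplified model of the ADAF containers:
    # meta and res are flat tables; sys maps system names to rasters.
    meta: Dict[str, list] = field(default_factory=dict)
    res: Dict[str, list] = field(default_factory=dict)
    sys: Dict[str, Dict[str, Raster]] = field(default_factory=dict)


adaf = ADAF()
adaf.meta['operator'] = ['test rig 1']

# One measurement system ('DAQ', an invented name) holding a 100 Hz raster.
system = adaf.sys.setdefault('DAQ', {})
system['100Hz'] = Raster(
    basis=[0.0, 0.01, 0.02],
    signals={'pressure': [1.0, 1.1, 1.2]},
)
```

A second system with its own zero time, or a second raster at another sample rate, would simply be added alongside the first, which is exactly the flexibility the system/raster split provides.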
This node uses plugins. Each supported file format has its own plugin. Each plugin has its own configuration, which is reached by choosing among the importers in the configuration GUI. The documentation for each plugin is obtained by clicking on the listed file formats below.
The node has an auto configuration mode that uses a validity check in the plugins to detect and choose the proper plugin for the given datasource. When the node is executed in auto mode, the default settings for the plugins are used.
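The auto mode described above amounts to asking each plugin whether it accepts the datasource and taking the first match. The sketch below illustrates that selection loop; the plugin classes and the `valid_for_file` method name are invented for this example and are not the actual Sympathy plugin interface.

```python
class CsvImporter:
    # Hypothetical plugin: claims files by extension only.
    name = 'CSV'

    def valid_for_file(self, path):
        return path.endswith('.csv')


class MatImporter:
    name = 'MAT'

    def valid_for_file(self, path):
        return path.endswith('.mat')


def auto_select(plugins, path):
    # Auto mode: probe each plugin's validity check in order and
    # return the first one that accepts the datasource.
    for plugin in plugins:
        if plugin.valid_for_file(path):
            return plugin
    return None  # no match: the configured action on import failure applies


chosen = auto_select([CsvImporter(), MatImporter()], 'measurement.mat')
```

When a plugin is chosen this way, its default settings are used, since auto mode bypasses the per-plugin configuration GUI.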