The Flexible ETL Load Processor operates on a special type of dataset: a pre-defined ETL (Extract, Transform, Load) source. Because the ETL source can be reused within this processor, sensitive information (e.g., user name and password) does not have to be entered repeatedly. The processor loads the requested data for use in the workflow.
Note: This section mostly presents example usage of the processor. For more detailed information about technical features, see the article about processors with loading functionality, which contains an entry for the Flexible ETL Load Processor.
The processor does not have an input port.
- Data Location: Instructions on how to create a Connection and a Data Table from that Connection can be found here.
- Table Name or custom SQL: Be aware that the remote table name is not necessarily identical to the ONE DATA Data Table name.
The loaded Data Table can be used as input in further processors.
The Data Table "departmentsTable" was created from an ETL connection. The connection table is named "departments".
If the entire Data Table needs to be returned, typing the table name alone is sufficient; a specific SQL SELECT statement is not required.
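For the example above, either of the following values could be entered into the "Table name or custom sql" configuration element. The column "location" in the filtered variant is a hypothetical example and not part of the original article:

```sql
-- Load the entire remote table (table name only):
departments

-- Or load a subset via a custom SQL statement
-- ("location" is an assumed example column):
SELECT * FROM departments WHERE location = 'Munich'
```

Note that the remote table name "departments" is used here, not the ONE DATA Data Table name "departmentsTable".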
When attempting to load a database table, make sure that the correct schema is active. Your current schema is shown under Connections, where all Connections accessible with this schema are listed.
It is not necessary to specifically mention the schema in the "Table name or custom sql" configuration element when loading an Oracle table available under the current schema.
In the unusual case that a table not available under your current schema needs to be accessed, prefixing the Oracle table name with "nameOfOtherSchema." allows the table to be loaded.
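As a sketch, reusing the "departments" table from the example above, a schema-qualified configuration value could look as follows ("nameOfOtherSchema" is the placeholder used in this article, not a real schema):

```sql
-- Table name alone, qualified with the other schema:
nameOfOtherSchema.departments

-- Or as part of a custom SQL statement:
SELECT * FROM nameOfOtherSchema.departments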
The database user needs to have access to this other schema. Access to schemas cannot be configured in ONE DATA; it must be granted directly within the configured target database.