Data Source - TROCCO
Summary
Help page for ETL Configurations that transfer data held in TROCCO.
The Data Source TROCCO can transfer any of the following data:
- Data Catalog
    - Metadata held in the Data Catalog can be transferred.
- ETL Configuration, Data Mart Configuration, Workflow Definition
    - Execution history of previously run jobs can be transferred.
Constraints
In order to transfer data in the Data Catalog, the Data Catalog must be activated.
Data Catalog is a paid option.
To request a trial or to sign up, please contact your sales representative or Customer Success.
Job execution history can be transferred for jobs executed up to one year before the date and time at which the transfer runs.
Historical data for jobs executed before that point cannot be transferred.
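In other words, the earliest transferable created_at is roughly one year before the moment the transfer runs. A minimal sketch of that cutoff, approximating one year as 365 days (an assumption for illustration, not TROCCO's exact rule):

```python
from datetime import datetime, timedelta, timezone

# Earliest execution date/time whose job history can still be transferred,
# approximating "one year" as 365 days (assumption, not TROCCO's exact rule).
now = datetime.now(timezone.utc)
cutoff = now - timedelta(days=365)
print(f"Transferable history: created_at >= {cutoff:%Y-%m-%d %H:%M:%S %Z}")
```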
Setting items
STEP 1: Basic settings
Item name | Required | Description |
---|---|---|
Target | Yes | Select the target to be transferred from the following. |
Data type | Yes | Select the data type to be transferred from the following. |
When ETL Configuration, Data Mart Configuration, or Workflow Definition is selected as the target
Item name | Required | Description |
---|---|---|
Data acquisition period | Yes | Enter the start and end dates and times of the data you wish to transfer, in YYYY-MM-DD or YYYY-MM-DD HH:mm:ss format. If HH:mm:ss is omitted, 00:00:00 is used instead. |
Time zone | Yes | Select the time zone for the data acquisition period. |
The data acquisition period is based on the execution date/time (created_at) of each job.
Therefore, job history records whose created_at value falls within the data acquisition period are transferred.
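Conceptually, the filter applied to job history is a range check on created_at. A minimal sketch, assuming an inclusive range (the exact boundary handling is not specified here) and timezone-aware datetime values:

```python
from datetime import datetime

def in_acquisition_period(created_at: datetime,
                          start: datetime,
                          end: datetime) -> bool:
    """True if a job's created_at falls inside the data acquisition period."""
    return start <= created_at <= end
```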
Transferred items
Below is a list of items transferred for each target and each data type.
Data Catalog
User-defined metadata
There are two data types that can be transferred:
- User-defined metadata (BigQuery table information)
    - Transfers table information metadata.
- User-defined metadata (BigQuery Column Setting information)
    - Transfers column information metadata.
Column metadata is transferred only for table columns where at least one basic or user-defined metadata entry has already been filled in.
Column name | Type | Description |
---|---|---|
bigquery_project_id | string | BigQuery project the metadata belongs to |
bigquery_dataset | string | BigQuery dataset the metadata belongs to |
bigquery_table | string | BigQuery table name |
bigquery_column | string | BigQuery column name. Transferred only when User-defined metadata (BigQuery Column Setting information) is selected as the data type. |
trocco_metadata_logical_name | string | Logical name of the basic metadata |
trocco_metadata_description | string | Description of the basic metadata |
trocco_metadata_last_updated_at | timestamp | Last update date and time of the basic metadata |
(Fields defined in the template for user-defined metadata) | (see below for details) | Values of the user-defined metadata |
user_defined_metadata_last_updated_at | timestamp | Last update date and time of the user-defined metadata |
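For concreteness, a transferred row for the table-information data type might look like the sketch below. The owner key stands in for a field defined in your user-defined metadata template, and all values are illustrative, not real output:

```python
# Illustrative only: "owner" is a hypothetical template-defined field;
# every value here is made up to show the shape of one transferred row.
sample_row = {
    "bigquery_project_id": "my-project",
    "bigquery_dataset": "my_dataset",
    "bigquery_table": "orders",
    "trocco_metadata_logical_name": "Orders",
    "trocco_metadata_description": "Order history table",
    "trocco_metadata_last_updated_at": "2024-01-01 00:00:00 UTC",
    "owner": "data-team",
    "user_defined_metadata_last_updated_at": "2024-01-02 00:00:00 UTC",
}
```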
About Data Types
The data types defined in the template are converted to the following types:
- String -> string
- Text(Markdown) -> string
- Integer -> long
- Boolean -> boolean
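The same conversion expressed as a lookup table, for reference (Python is used only for illustration):

```python
# Lookup table mirroring the template-type to column-type conversion above.
TEMPLATE_TYPE_MAPPING = {
    "String": "string",
    "Text(Markdown)": "string",
    "Integer": "long",
    "Boolean": "boolean",
}
```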
About column names
- If non-alphanumeric characters are used in the field names defined in the template for user-defined metadata, some Data Destination Connectors may not be able to transfer the data.
- In that case, change the column name to one consisting only of alphanumeric characters in Column Setting of ETL Configuration STEP 2, as sketched below.
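A minimal sketch of such a rename, assuming a simple replace-with-underscore policy. The actual rename is done manually in Column Setting; this policy is an assumption, not TROCCO behavior:

```python
import re

def sanitize_column_name(name: str) -> str:
    """Replace every non-alphanumeric character with an underscore."""
    return re.sub(r"[^0-9A-Za-z]", "_", name)

print(sanitize_column_name("order-date"))  # -> order_date
```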
Column reference list
Transfers the column reference list.
Column name | Type | Description |
---|---|---|
bigquery_src_project_id | string | BigQuery project of the metadata (reference source) |
bigquery_src_dataset | string | BigQuery dataset of the metadata (reference source) |
bigquery_src_table | string | BigQuery table name (reference source) |
bigquery_src_column | string | BigQuery column name (reference source) |
bigquery_dst_project_id | string | BigQuery project of the metadata (reference destination) |
bigquery_dst_dataset | string | BigQuery dataset of the metadata (reference destination) |
bigquery_dst_table | string | BigQuery table name (reference destination) |
bigquery_dst_column | string | BigQuery column name (reference destination) |
creation_type | string | How the column reference was created. One of: trocco_data_source, user |
created_by | string | Email address of the user who created the column reference |
last_updated_at | timestamp | Last update date and time of the column reference |
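Because each row pairs a source column with a destination column, the list can be read as the edges of a lineage graph. A minimal sketch, assuming rows arrive as dictionaries keyed by the column names above:

```python
def to_lineage_edge(row: dict) -> tuple[str, str]:
    """Build a (source, destination) edge from one column-reference row."""
    src = ("{bigquery_src_project_id}.{bigquery_src_dataset}"
           ".{bigquery_src_table}.{bigquery_src_column}").format(**row)
    dst = ("{bigquery_dst_project_id}.{bigquery_dst_dataset}"
           ".{bigquery_dst_table}.{bigquery_dst_column}").format(**row)
    return src, dst
```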
ETL Configuration
Job execution history
Transfers historical data about previously executed ETL Jobs.
Column name | Type | Description |
---|---|---|
job_id | long | ETL Job ID |
job_url | string | ETL Job URL |
job_definition_id | long | ID of the ETL Configuration from which the job originated |
job_definition_url | string | URL of the ETL Configuration from which the job originated |
job_definition_name | string | Name of the ETL Configuration from which the job originated |
executor_type | string | How the ETL Job was executed. manual: manual execution / scheduler: scheduled execution / workflow: executed as a workflow task / api: execution via API / job_dependency: execution by trigger job |
status | string | Status of the ETL Job. queued: waiting for execution / setting_up: preparing to run / executing: executing / interrupting: execution is interrupted / succeeded: execution completed (success) / error: execution completed (error) / canceled: execution completed (canceled) / skipped: execution completed (skipped) |
transfer_records | long | Number of records transferred |
transfer_bytes | long | Number of bytes transferred |
skipped_records | long | Number of records skipped |
started_at | timestamp | Date and time the job started |
finished_at | timestamp | Date and time the job finished |
created_at | timestamp | Date and time the ETL Job was executed (%Y-%m-%d %H:%M:%S %Z) |
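Once transferred, the history lends itself to simple aggregation. A minimal sketch that counts jobs per status and totals the records transferred, assuming rows arrive as dictionaries keyed by the column names above:

```python
from collections import Counter

def summarize_jobs(rows: list[dict]) -> tuple[Counter, int]:
    """Count ETL Jobs per status and total the records transferred."""
    status_counts = Counter(row["status"] for row in rows)
    # transfer_records may be missing/None for jobs that moved no data.
    total_records = sum(row["transfer_records"] or 0 for row in rows)
    return status_counts, total_records
```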
Data Mart Configuration
Job execution history
Transfers historical data on previously executed data mart jobs.
Column name | Type | Description |
---|---|---|
datamart_job_id | long | Data Mart Job ID |
datamart_job_url | string | Data Mart Job URL |
datamart_definition_id | long | ID of the Data Mart Configuration from which the job originated |
datamart_definition_url | string | URL of the Data Mart Configuration from which the job originated |
datamart_definition_name | string | Name of the Data Mart Configuration from which the job originated |
executor_type | string | How the Data Mart Job was executed. manual: manual execution / scheduler: scheduled execution / workflow: executed as a workflow task / job_dependency: execution by trigger job |
status | string | Status of the Data Mart Job. queued: waiting for execution / setting_up: preparing to run / executing: executing / interrupting: execution is interrupted / succeeded: execution completed (success) / error: execution completed (error) / canceled: execution completed (canceled) / skipped: execution completed (skipped) |
transfer_records | long | Number of records transferred |
started_at | timestamp | Date and time the job started |
finished_at | timestamp | Date and time the job finished |
created_at | timestamp | Date and time the Data Mart Job was executed (%Y-%m-%d %H:%M:%S %Z) |
Workflow
Job execution history
Transfers historical data about previously executed Workflow Jobs.
Column name | Type | Description |
---|---|---|
pipeline_job_item_id | long | Unique ID associated with the Workflow Job ID |
pipeline_job_id | long | Workflow Job ID |
pipeline_job_url | string | Workflow Job URL |
pipeline_definition_id | long | ID of the Workflow Definition from which the job originated |
pipeline_definition_url | string | URL of the Workflow Definition from which the job originated |
pipeline_definition_name | string | Name of the Workflow Definition from which the job originated |
executor_type | string | How the Workflow Job was executed. manual: manual execution / scheduler: scheduled execution / workflow: executed as a workflow task / api: execution via API / retry: retry execution (re-execution from the stop position or re-execution by retry setting) |
canceled_type | string | How the job was canceled |
status | string | Status of the Workflow Job. queued: waiting for execution / setting_up: preparing to run / executing: executing / interrupting: execution is interrupted / succeeded: execution completed (success) / error: execution completed (error) / canceled: execution completed (canceled) / skipped: execution completed (skipped) / retry_waiting: waiting between execution completed (error) and the start of a retry (only when a retry count is set) |
started_at | timestamp | Date and time the job started |
finished_at | timestamp | Date and time the job finished |
created_at | timestamp | Date and time the Workflow Job was executed (%Y-%m-%d %H:%M:%S %Z) |
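The started_at and finished_at columns make job duration easy to derive. A minimal sketch, assuming both fields have been parsed into datetime values and may be absent for jobs that have not finished:

```python
from datetime import datetime
from typing import Optional

def job_duration_seconds(row: dict) -> Optional[float]:
    """Return a Workflow Job's duration in seconds, or None if unfinished."""
    started: Optional[datetime] = row.get("started_at")
    finished: Optional[datetime] = row.get("finished_at")
    if started is None or finished is None:
        return None
    return (finished - started).total_seconds()
```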