Transfer source - TROCCO
- Updated on 17 Jul 2024
Summary
This page explains the settings for transferring data held in TROCCO.
The transfer source TROCCO can transfer the following data:
- Data catalog
  - Metadata held in the data catalog can be transferred.
- Transfer settings, data mart definitions, and workflow definitions
  - Historical data on previously executed jobs can be transferred.
Constraints
Transferring data from the data catalog requires the data catalog to be enabled.
The data catalog is a paid option.
If you would like a trial or a contract, please contact your sales representative or Customer Success.
Job execution history can be transferred for jobs executed up to one year in the past, counting back from the date and time the job execution history transfer is run.
Historical data for jobs executed before that cannot be transferred.
Setting items
STEP 1: Basic settings
Item name | Required | Description |
---|---|---|
Target | Yes | Select the target to be transferred from the following. |
Data type | Yes | Select the data type to be transferred from the following. |
When transfer settings, data mart definitions, or workflow definitions are selected
Item name | Required | Description |
---|---|---|
Data acquisition period | Yes | Enter the start and end date and time of the data you wish to transfer, in YYYY-MM-DD or YYYY-MM-DD HH:mm:ss format. If HH:mm:ss is omitted, 00:00:00 is used instead. |
Time zone | Yes | Select the time zone for the data acquisition period. |
The data acquisition period is based on the execution date and time ( created_at ) of each job.
Therefore, historical data for jobs whose created_at values fall within the data acquisition period are transferred.
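As a rough sketch, the period boundaries and the created_at comparison described above could be modeled as follows (the function names are illustrative, not TROCCO identifiers):

```python
from datetime import datetime

def parse_boundary(value: str) -> datetime:
    # Accepts YYYY-MM-DD or YYYY-MM-DD HH:mm:ss; when HH:mm:ss is
    # omitted, 00:00:00 is assumed, as described above.
    for fmt in ("%Y-%m-%d %H:%M:%S", "%Y-%m-%d"):
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"unsupported period format: {value!r}")

def in_period(created_at: datetime, start: str, end: str) -> bool:
    # A job's history row is transferred when its created_at falls
    # inside the acquisition period.
    return parse_boundary(start) <= created_at <= parse_boundary(end)
```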
Transfer items
Below is a list of the items transferred for each target and data type.
Data catalog
User-defined metadata
Two data types can be transferred:
- User-defined metadata (BigQuery table information)
  - Transfers table information metadata.
- User-defined metadata (BigQuery column information)
  - Transfers column information metadata.
Only metadata for table columns for which at least one of the basic or user-defined metadata fields has already been entered is transferred.
Column name | Type | Description |
---|---|---|
bigquery_project_id | string | BigQuery project of the metadata |
bigquery_dataset | string | BigQuery dataset of the metadata |
bigquery_table | string | BigQuery table name |
bigquery_column | string | BigQuery column name. Transferred only when user-defined metadata (BigQuery column information) is selected as the data type. |
trocco_metadata_logical_name | string | Logical name of the basic metadata |
trocco_metadata_description | string | Description of the basic metadata |
trocco_metadata_last_updated_at | timestamp | Date and time the basic metadata was last updated |
(Fields defined in the user-defined metadata template) | (See below) | User-defined metadata values |
user_defined_metadata_last_updated_at | timestamp | Date and time the user-defined metadata was last updated |
About data types
The data types defined in the template are converted as follows:
- String → string
- Text(Markdown) → string
- Integer → long
- Boolean → boolean
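The conversion table above can be expressed as a simple lookup; the mapping below mirrors it (the dict and function names are ours, not TROCCO identifiers):

```python
# Illustrative mapping of template-defined metadata types to the
# transferred column types, mirroring the list above.
TEMPLATE_TYPE_TO_COLUMN_TYPE = {
    "String": "string",
    "Text(Markdown)": "string",
    "Integer": "long",
    "Boolean": "boolean",
}

def converted_type(template_type: str) -> str:
    # Look up the output column type for a template-defined field type.
    return TEMPLATE_TYPE_TO_COLUMN_TYPE[template_type]
```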
About column names
- If non-alphanumeric characters are used in a field name defined in the user-defined metadata template, it may not be transferable, depending on the destination connector.
- In that case, change the column name to one consisting only of alphanumeric characters in the column definitions in STEP 2 of the transfer settings.
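One way to spot field names that may need renaming, and to derive an alphanumeric-only candidate, is sketched below (TROCCO itself has you rename columns in STEP 2; these helpers are only illustrative):

```python
import re

# Matches names made up solely of ASCII alphanumeric characters.
ALNUM_ONLY = re.compile(r"^[A-Za-z0-9]+$")

def needs_rename(column_name: str) -> bool:
    # True when the field name contains characters that a destination
    # connector may reject.
    return ALNUM_ONLY.fullmatch(column_name) is None

def alphanumeric_name(column_name: str) -> str:
    # Drop every character that is not ASCII alphanumeric.
    return re.sub(r"[^A-Za-z0-9]", "", column_name)
```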
Column reference list
Transfers the column reference list.
Column name | Type | Description |
---|---|---|
bigquery_src_project_id | string | BigQuery project of the metadata (reference source) |
bigquery_src_dataset | string | BigQuery dataset of the metadata (reference source) |
bigquery_src_table | string | BigQuery table name (reference source) |
bigquery_src_column | string | BigQuery column name (reference source) |
bigquery_dst_project_id | string | BigQuery project of the metadata (reference destination) |
bigquery_dst_dataset | string | BigQuery dataset of the metadata (reference destination) |
bigquery_dst_table | string | BigQuery table name (reference destination) |
bigquery_dst_column | string | BigQuery column name (reference destination) |
creation_type | string | How the column reference was created: trocco_data_source or user |
created_by | string | Email address of the user who created the column reference |
last_updated_at | timestamp | Date and time the column reference was last updated |
Transfer Settings
Job execution history
Transfers historical data about previously executed transfer jobs.
Column name | Type | Description |
---|---|---|
job_id | long | Transfer job ID |
job_url | string | Transfer job URL |
job_definition_id | long | ID of the transfer setting from which the job originated |
job_definition_url | string | URL of the transfer setting from which the job originated |
job_definition_name | string | Name of the transfer setting from which the job originated |
executor_type | string | How the transfer job was executed. manual: manual execution / scheduler: scheduled execution / workflow: executed as a workflow task / api: executed via API / job_dependency: executed by a trigger job |
status | string | Transfer job status. queued: waiting for execution / setting_up: preparing to run / executing: executing / interrupting: being interrupted / succeeded: completed (success) / error: completed (error) / canceled: completed (canceled) / skipped: completed (skipped) |
transfer_records | long | Number of records transferred |
transfer_bytes | long | Number of bytes transferred |
skipped_records | long | Number of records skipped |
started_at | timestamp | Date and time the transfer job started |
finished_at | timestamp | Date and time the transfer job finished |
created_at | timestamp | Date and time the transfer job was executed ( %Y-%m-%d %H:%M:%S %Z ) |
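The created_at columns in these tables use the %Y-%m-%d %H:%M:%S %Z format; formatting a timezone-aware datetime reproduces it, as a minimal sketch (the timestamp value is made up):

```python
from datetime import datetime, timezone

# Format an aware datetime the way the created_at column is formatted.
ts = datetime(2024, 7, 17, 9, 30, 0, tzinfo=timezone.utc)
formatted = ts.strftime("%Y-%m-%d %H:%M:%S %Z")  # "2024-07-17 09:30:00 UTC"
```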
Data Mart Definition
Job execution history
Transfers historical data on previously executed data mart jobs.
Column name | Type | Description |
---|---|---|
datamart_job_id | long | Data mart job ID |
datamart_job_url | string | Data mart job URL |
datamart_definition_id | long | ID of the data mart definition from which the job originated |
datamart_definition_url | string | URL of the data mart definition from which the job originated |
datamart_definition_name | string | Name of the data mart definition from which the job originated |
executor_type | string | How the data mart job was executed. manual: manual execution / scheduler: scheduled execution / workflow: executed as a workflow task / job_dependency: executed by a trigger job |
status | string | Data mart job status. queued: waiting for execution / setting_up: preparing to run / executing: executing / interrupting: being interrupted / succeeded: completed (success) / error: completed (error) / canceled: completed (canceled) / skipped: completed (skipped) |
transfer_records | long | Number of records transferred |
started_at | timestamp | Date and time the data mart job started |
finished_at | timestamp | Date and time the data mart job finished |
created_at | timestamp | Date and time the data mart job was executed ( %Y-%m-%d %H:%M:%S %Z ) |
Workflow Definition
Job execution history
Transfers historical data on previously executed workflow jobs.
Column name | Type | Description |
---|---|---|
pipeline_job_item_id | long | Unique ID associated with the workflow job ID |
pipeline_job_id | long | Workflow job ID |
pipeline_job_url | string | Workflow job URL |
pipeline_definition_id | long | ID of the workflow definition from which the workflow job originated |
pipeline_definition_url | string | URL of the workflow definition from which the workflow job originated |
pipeline_definition_name | string | Name of the workflow definition from which the workflow job originated |
executor_type | string | How the workflow job was executed. manual: manual execution / scheduler: scheduled execution / workflow: executed as a workflow task / api: executed via API / retry: retry execution (re-execution from the stop position or re-execution by the retry setting) |
canceled_type | string | |
status | string | Workflow job status. queued: waiting for execution / setting_up: preparing to run / executing: executing / interrupting: being interrupted / succeeded: completed (success) / error: completed (error) / canceled: completed (canceled) / skipped: completed (skipped) / retry_waiting: waiting to start a retry after completing with an error (only when a retry count is set) |
started_at | timestamp | Date and time the workflow job started |
finished_at | timestamp | Date and time the workflow job finished |
created_at | timestamp | Date and time the workflow job was executed ( %Y-%m-%d %H:%M:%S %Z ) |