Transfer source - TROCCO
  • 17 Jul 2024

This help page describes the settings for transferring data held in TROCCO.
The TROCCO transfer source can transfer any of the following data:

  • Data catalog
    • Metadata held in the data catalog can be transferred.
  • Transfer settings, data mart definitions, and workflow definitions
    • Historical data on previously executed jobs can be transferred.

Constraints

About Data Catalog

Transferring data from the data catalog requires that the data catalog be enabled.

The data catalog is a paid option.
If you would like a trial or wish to sign a contract, please contact your sales representative or Customer Success.

Period during which job execution history can be obtained

Historical data can be transferred for jobs executed up to one year before the date and time at which the job execution history transfer runs.
Historical data for jobs executed before that point cannot be transferred.
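
For example, the following minimal sketch shows how this one-year window constrains the retrievable range (illustrative only; the helper is hypothetical, not TROCCO code, and approximates one year as 365 days):

```python
from datetime import datetime, timedelta

# Hypothetical helper, not part of TROCCO: compute the oldest job
# execution datetime whose history can still be transferred.
def earliest_retrievable(run_at: datetime) -> datetime:
    # One year before the moment the history transfer runs
    # (approximated here as 365 days).
    return run_at - timedelta(days=365)

run_at = datetime(2024, 7, 17, 12, 0, 0)
print(earliest_retrievable(run_at))  # 2023-07-18 12:00:00 (365 days earlier)
```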

Setting items

STEP1 Basic settings

| Item name | Required | Contents |
| --- | --- | --- |
| Target | Yes | Select the target to be transferred from the following:<br>• Data catalog<br>• Transfer settings<br>• Data mart definition<br>• Workflow definition |
| Data type | Yes | Select the data type to be transferred from the following (see the sketch after this table):<br>If Data catalog is selected:<br>• User-defined metadata (BigQuery table information)<br>• User-defined metadata (BigQuery column information)<br>• Column reference list<br>If Transfer settings, Data mart definition, or Workflow definition is selected:<br>• Job execution history |
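
The valid target-to-data-type combinations can be summarized as a simple mapping. A sketch with illustrative identifiers (these constant names are assumptions, not TROCCO's own identifiers):

```python
# Which data types each transfer target supports, per the table above.
# Key and value strings are illustrative, not TROCCO identifiers.
ALLOWED_DATA_TYPES = {
    "data_catalog": [
        "user_defined_metadata_bigquery_table",
        "user_defined_metadata_bigquery_column",
        "column_reference_list",
    ],
    "transfer_settings": ["job_execution_history"],
    "data_mart_definition": ["job_execution_history"],
    "workflow_definition": ["job_execution_history"],
}

def validate(target: str, data_type: str) -> bool:
    # A target/data type pair is valid only if listed above.
    return data_type in ALLOWED_DATA_TYPES.get(target, [])

print(validate("data_catalog", "job_execution_history"))  # False
```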

When Transfer settings, Data mart definition, or Workflow definition is selected

| Item name | Required | Contents |
| --- | --- | --- |
| Data acquisition period | Yes | Enter the start and end date and time of the data you wish to transfer.<br>Enter values in YYYY-MM-DD or YYYY-MM-DD HH:mm:ss format.<br>If HH:mm:ss is not specified, 00:00:00 is used instead. |
| Time zone | Yes | Select the time zone for the data acquisition period. |

Criteria for the data acquisition period

The data acquisition period is based on the execution date and time (created_at) of each job.
Therefore, historical data for jobs whose created_at value falls within the data acquisition period are transferred.
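
As an illustration, here is a minimal sketch (using hypothetical job records, not TROCCO code) of how a boundary entered without a time part defaults to 00:00:00 and how jobs are then selected by created_at:

```python
from datetime import datetime

# Parse a period boundary, defaulting the time part to 00:00:00
# when only a date is given, as described above.
def parse_boundary(value: str) -> datetime:
    try:
        return datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
    except ValueError:
        return datetime.strptime(value, "%Y-%m-%d")  # time -> 00:00:00

start = parse_boundary("2024-07-01")            # 2024-07-01 00:00:00
end = parse_boundary("2024-07-17 23:59:59")

# Hypothetical job history rows; selection is by created_at.
jobs = [
    {"job_id": 1, "created_at": datetime(2024, 6, 30, 23, 0)},
    {"job_id": 2, "created_at": datetime(2024, 7, 10, 9, 30)},
]
selected = [j for j in jobs if start <= j["created_at"] <= end]
print([j["job_id"] for j in selected])  # [2]
```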

Transfer items

Below is a list of the items transferred for each target and data type.

Data catalog

User-defined metadata

There are two data types that can be transferred:

  • User-defined metadata (BigQuery table information)
  • User-defined metadata (BigQuery column information)

Tables subject to metadata transfer

Only metadata for tables and columns for which at least one of the basic or user-defined metadata has already been entered is transferred.

| Column name | Type | Description |
| --- | --- | --- |
| bigquery_project_id | string | BigQuery project of the metadata |
| bigquery_dataset | string | BigQuery dataset of the metadata |
| bigquery_table | string | BigQuery table name |
| bigquery_column | string | BigQuery column name<br>Transferred only when User-defined metadata (BigQuery column information) is selected as the data type. |
| trocco_metadata_logical_name | string | Logical name of the basic metadata |
| trocco_metadata_description | string | Description of the basic metadata |
| trocco_metadata_last_updated_at | timestamp | Date and time the basic metadata was last updated |
| (Fields defined in the template for user-defined metadata) | (see below for details) | Values of the user-defined metadata |
| user_defined_metadata_last_updated_at | timestamp | Date and time the user-defined metadata was last updated |

Fields defined in the template for user-defined metadata

  • About data types
    • The data types defined in the template are converted to the following types (see the sketch after this list):
      • String -> string
      • Text(Markdown) -> string
      • Integer -> long
      • Boolean -> boolean
  • About column names
    • If non-alphanumeric characters are used in a field name defined in the template for user-defined metadata, the field may not be transferred, depending on the destination connector.
    • In that case, change the column name to one consisting only of alphanumeric characters in the column definition in STEP 2 of the transfer settings.
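
The following sketch makes both notes concrete (illustrative only; the mapping constant and helper are not TROCCO code):

```python
import re

# Template data types and the column types they are converted to,
# per the list above.
TEMPLATE_TYPE_MAP = {
    "String": "string",
    "Text(Markdown)": "string",
    "Integer": "long",
    "Boolean": "boolean",
}

# Hypothetical helper: reduce a template field name to alphanumeric
# characters so any destination connector can accept it.
def sanitize_column_name(name: str) -> str:
    return re.sub(r"[^0-9A-Za-z]", "", name)

print(TEMPLATE_TYPE_MAP["Text(Markdown)"])  # string
print(sanitize_column_name("売上_total"))    # total
```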

Column reference list

Transfers the column reference list.

| Column name | Type | Description |
| --- | --- | --- |
| bigquery_src_project_id | string | BigQuery project (reference source) |
| bigquery_src_dataset | string | BigQuery dataset (reference source) |
| bigquery_src_table | string | BigQuery table name (reference source) |
| bigquery_src_column | string | BigQuery column name (reference source) |
| bigquery_dst_project_id | string | BigQuery project (reference destination) |
| bigquery_dst_dataset | string | BigQuery dataset (reference destination) |
| bigquery_dst_table | string | BigQuery table name (reference destination) |
| bigquery_dst_column | string | BigQuery column name (reference destination) |
| creation_type | string | How the column reference was created:<br>• trocco_data_source: automatically defined<br>• user: user-defined |
| created_by | string | Email address of the column reference creator |
| last_updated_at | timestamp | Date and time the column reference was last updated |
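
As an illustration of the shape of this data, here is a hypothetical row set split by creation_type (all sample values are invented):

```python
from collections import defaultdict

# Hypothetical column reference rows shaped like the table above.
rows = [
    {"bigquery_src_column": "order_id", "bigquery_dst_column": "order_id",
     "creation_type": "trocco_data_source"},
    {"bigquery_src_column": "note", "bigquery_dst_column": "memo",
     "creation_type": "user"},
]

# Separate automatically defined references from user-defined ones.
by_creation = defaultdict(list)
for row in rows:
    by_creation[row["creation_type"]].append(row)

print(sorted(by_creation))  # ['trocco_data_source', 'user']
```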

Transfer settings

Job execution history

Transfers historical data on previously executed transfer jobs.

| Column name | Type | Description |
| --- | --- | --- |
| job_id | long | Transfer job ID |
| job_url | string | Transfer job URL |
| job_definition_id | long | ID of the transfer setting from which the job originated |
| job_definition_url | string | URL of the transfer setting from which the job originated |
| job_definition_name | string | Name of the transfer setting from which the job originated |
| executor_type | string | How the transfer job was executed:<br>• manual: Manual execution<br>• scheduler: Scheduled execution<br>• workflow: Executed as a workflow task<br>• api: Executed via the API<br>• job_dependency: Executed by a trigger job |
| status | string | Transfer job status:<br>• queued: Waiting for execution<br>• setting_up: Preparing to run<br>• executing: Executing<br>• interrupting: Execution being interrupted<br>• succeeded: Execution completed (success)<br>• error: Execution completed (error)<br>• canceled: Execution completed (canceled)<br>• skipped: Execution completed (skipped) |
| transfer_records | long | Number of records transferred |
| transfer_bytes | long | Number of bytes transferred |
| skipped_records | long | Number of records skipped |
| started_at | timestamp | Date and time the transfer job started |
| finished_at | timestamp | Date and time the transfer job finished |
| created_at | timestamp | Date and time the transfer job was executed (%Y-%m-%d %H:%M:%S %Z) |
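
For example, a minimal sketch of reading these timestamp columns (assuming they arrive as strings in the documented format; note that Python's strptime matches a %Z name such as "UTC" but returns a naive datetime):

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S %Z"  # format documented for created_at

# Hypothetical history row values.
created_at = datetime.strptime("2024-07-17 09:15:00 UTC", FMT)
started_at = datetime.strptime("2024-07-17 09:15:03 UTC", FMT)
finished_at = datetime.strptime("2024-07-17 09:16:48 UTC", FMT)

# started_at and finished_at allow deriving a job duration.
print((finished_at - started_at).total_seconds())  # 105.0
```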

Data mart definition

Job execution history

Transfers historical data on previously executed data mart jobs.

| Column name | Type | Description |
| --- | --- | --- |
| datamart_job_id | long | Data mart job ID |
| datamart_job_url | string | Data mart job URL |
| datamart_definition_id | long | ID of the data mart definition from which the job originated |
| datamart_definition_url | string | URL of the data mart definition from which the job originated |
| datamart_definition_name | string | Name of the data mart definition from which the job originated |
| executor_type | string | How the data mart job was executed:<br>• manual: Manual execution<br>• scheduler: Scheduled execution<br>• workflow: Executed as a workflow task<br>• job_dependency: Executed by a trigger job |
| status | string | Data mart job status:<br>• queued: Waiting for execution<br>• setting_up: Preparing to run<br>• executing: Executing<br>• interrupting: Execution being interrupted<br>• succeeded: Execution completed (success)<br>• error: Execution completed (error)<br>• canceled: Execution completed (canceled)<br>• skipped: Execution completed (skipped) |
| transfer_records | long | Number of records transferred |
| started_at | timestamp | Date and time the data mart job started |
| finished_at | timestamp | Date and time the data mart job finished |
| created_at | timestamp | Date and time the data mart job was executed (%Y-%m-%d %H:%M:%S %Z) |

Workflow definition

Job execution history

Transfers historical data on previously executed workflow jobs.

| Column name | Type | Description |
| --- | --- | --- |
| pipeline_job_item_id | long | Unique ID associated with the workflow job ID |
| pipeline_job_id | long | Workflow job ID |
| pipeline_job_url | string | Workflow job URL |
| pipeline_definition_id | long | ID of the workflow definition from which the workflow job originated |
| pipeline_definition_url | string | URL of the workflow definition from which the workflow job originated |
| pipeline_definition_name | string | Name of the workflow definition from which the workflow job originated |
| executor_type | string | How the workflow job was executed:<br>• manual: Manual execution<br>• scheduler: Scheduled execution<br>• workflow: Executed as a workflow task<br>• api: Executed via the API<br>• retry: Retry execution (re-execution from the stop position, or re-execution by a retry setting) |
| canceled_type | string |  |
| status | string | Workflow job status:<br>• queued: Waiting for execution<br>• setting_up: Preparing to run<br>• executing: Executing<br>• interrupting: Execution being interrupted<br>• succeeded: Execution completed (success)<br>• error: Execution completed (error)<br>• canceled: Execution completed (canceled)<br>• skipped: Execution completed (skipped)<br>• retry_waiting: Waiting between execution completed (error) and the start of a retry (only when a retry count is set) |
| started_at | timestamp | Date and time the workflow job started |
| finished_at | timestamp | Date and time the workflow job finished |
| created_at | timestamp | Date and time the workflow job was executed (%Y-%m-%d %H:%M:%S %Z) |
