Datamart - Snowflake
    This page explains how to set up a Data Mart Configuration using Snowflake.

    Constraints

    • Only Snowflake Connection Configurations that use Key Pair Authentication as the authentication method can be used with Datamart - Snowflake.

    Setting items

    STEP 1: Basic settings

    Item name | Required | Default value | Description
    Snowflake Connection Configuration | Yes | - | Select a preregistered Snowflake Connection Configuration that has the permissions required for this Data Mart Configuration.
    Warehouse | Yes | - | Specify the warehouse to use for processing.
    Custom Variables | No | - | Custom variables set here can be used in the query and in the table name.

    Query settings

    Item name | Required | Default value | Description
    Query execution mode | Yes | Data Transfer Mode | You can choose from the following two modes:
      • Data Transfer Mode: Replace (full refresh) or append to a table simply by specifying the SQL and the destination table.
      • Free Description Mode: Execute any query (DDL, DELETE, INSERT, etc.) against the connected DWH.
    Query | Yes | - | Enter the SQL that retrieves the transfer data from Snowflake (see the example below). Custom variables can also be used to set values dynamically when the TROCCO job runs.
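    For reference, a Data Transfer Mode query might look like the following sketch. The database, table, and column names are illustrative only, and $target_date$ is a hypothetical custom variable written in TROCCO's dollar-sign notation.

    ```sql
    -- Illustrative Data Transfer Mode query.
    -- sales_db.public.orders and its columns are example names only.
    -- $target_date$ is a hypothetical custom variable that TROCCO replaces
    -- with its value before the query is sent to Snowflake.
    SELECT
        order_id,
        customer_id,
        order_amount
    FROM sales_db.public.orders
    WHERE order_date = '$target_date$';
    ```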

    In addition, the following settings can be specified only when the query execution mode is Data Transfer Mode.

    Data Setting

    Item name | Required | Default value | Description
    Output Destination Database | Yes | - | Specify the database to which the data will be output. The database must already exist.
    Destination Schema | Yes | - | Specify the name of the schema to which the data will be output. Schema names must consist only of letters, numbers, and underscores. The schema must already exist.
    Output Table | Yes | - | Specify the name of the table to which the data will be output. Table names must consist only of letters, numbers, and underscores. If the table does not exist, it will be created at transfer time.
    Write settings for the output destination table | Yes | - | You can choose from the following write methods (see the SQL sketch below):
      • Append: The query results are appended after the records in the existing table.
      • All records truncated (TRUNCATE INSERT): Records in the existing table are truncated and replaced with the query results. The schema of the existing table is not deleted.
      • REPLACE ALL: The existing table is dropped and replaced with the query results. The schema of the existing table is also deleted.
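    As a rough illustration of the three write methods, the following SQL sketch shows the behavior each one corresponds to. Here dst_table and src_table are placeholder names, and these are not necessarily the exact statements TROCCO issues.

    ```sql
    -- Placeholder names: dst_table is the output table, src_table stands in
    -- for the configured query. Illustrative only.

    -- Append: query results are added after the existing records.
    INSERT INTO dst_table
    SELECT * FROM src_table;

    -- TRUNCATE INSERT: existing records are removed, but the table definition is kept.
    TRUNCATE TABLE dst_table;
    INSERT INTO dst_table
    SELECT * FROM src_table;

    -- REPLACE ALL: the table itself is dropped and recreated from the query results.
    CREATE OR REPLACE TABLE dst_table AS
    SELECT * FROM src_table;
    ```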
    Job Setting

    Item name | Required | Default value | Description
    Parallel execution of jobs | - | No parallel job execution | Select whether to run the job when another job with the same Data Mart Configuration is already running at the time of execution.
      • No parallel job execution: The job is skipped and not executed.
      • Allow jobs to run in parallel: The job runs.
