Datamart - Snowflake


Summary

Help page for setting up a Data Mart Configuration using Snowflake.

Constraints

  • Only Connection Configurations that use Key Pair Authentication as the authentication method can be used with Datamart - Snowflake.
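For reference, Key Pair Authentication is configured on the Snowflake side by registering an RSA public key on the user that the Connection Configuration signs in as. The user name and key value below are placeholders, and the key pair itself is generated outside Snowflake.

```sql
-- Placeholder user name and key value; paste the PEM body without its header/footer lines.
ALTER USER trocco_user SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';

-- Confirm the key is registered by checking the RSA_PUBLIC_KEY_FP property.
DESC USER trocco_user;
```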

Setting items

STEP1 Basic settings

| Item name | Required | Default value | Description |
| --- | --- | --- | --- |
| Snowflake Connection Configuration | Yes | - | Select a preregistered Snowflake Connection Configuration that has the permissions required for this Data Mart Configuration. |
| Warehouse | Yes | - | Specify the warehouse to be used for query processing. |
| Custom Variables | No | - | Custom Variables set here can be used in the query and in the table name. |
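As a reference for the "necessary permissions" mentioned above, the grants below sketch a typical Snowflake-side setup. The warehouse, database, schema, and role names are placeholders, and the exact privileges required depend on the write method you choose.

```sql
-- Placeholders: transform_wh, analytics, analytics.public, trocco_role.
GRANT USAGE ON WAREHOUSE transform_wh TO ROLE trocco_role;
GRANT USAGE ON DATABASE analytics TO ROLE trocco_role;
GRANT USAGE ON SCHEMA analytics.public TO ROLE trocco_role;

-- Creating and writing output tables typically needs at least the following.
GRANT CREATE TABLE ON SCHEMA analytics.public TO ROLE trocco_role;
GRANT SELECT, INSERT, TRUNCATE ON ALL TABLES IN SCHEMA analytics.public TO ROLE trocco_role;
```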

Query settings

| Item name | Required | Default value | Description |
| --- | --- | --- | --- |
| Query execution mode | Yes | Data transfer mode | Choose one of the two modes described below. |
| Query | Yes | - | Enter the SQL that retrieves the transfer data from Snowflake. Custom Variables can also be used to determine setting values dynamically when TROCCO transfers the data. |

  • Data transfer mode: by simply specifying a SQL query and a destination table, you can replace (full refresh) or append to that table.
  • Free description mode: you can execute any query (DDL, DELETE, INSERT, etc.) against the DWH you are connected to.
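The query below is an illustration only: the table and column names are hypothetical, and $target_date$ stands for a Custom Variable defined in this Data Mart Configuration, written with the $...$ embedding form assumed here.

```sql
-- Data transfer mode: a plain SELECT whose result is written to the output table.
-- analytics.public.orders and $target_date$ are placeholders.
SELECT
    order_id,
    customer_id,
    amount
FROM analytics.public.orders
WHERE order_date = '$target_date$';
```

In Free description mode, non-SELECT statements can also be issued; a hypothetical delete-and-reload of a summary table might look like this.

```sql
-- Free description mode: arbitrary statements against the connected DWH (placeholders).
DELETE FROM analytics.public.daily_summary
WHERE summary_date = '$target_date$';

INSERT INTO analytics.public.daily_summary (summary_date, total_amount)
SELECT order_date, SUM(amount)
FROM analytics.public.orders
WHERE order_date = '$target_date$'
GROUP BY order_date;
```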

In addition, the following settings can be specified only when the query execution mode is Data transfer mode.

Data settings

| Item name | Required | Default value | Description |
| --- | --- | --- | --- |
| Output destination database | Yes | - | Specify the database to which the data is output. The database must already exist. |
| Output destination schema | Yes | - | Specify the name of the schema to which the data is output. Schema names may contain only letters, numbers, and underscores. The schema must already exist. |
| Output destination table | Yes | - | Specify the name of the table to which the data is output. Table names may contain only letters, numbers, and underscores. If the table does not exist, it is created at the time of transfer. |
| Write settings for output destination table | Yes | - | Choose one of the write methods listed below (an illustrative SQL sketch follows the list). |
  • Append (INSERT): the query results are appended after the existing records in the table.
  • Replace all records (TRUNCATE INSERT): the existing records are truncated and replaced with the query results. The existing table definition (schema) is kept.
  • Replace all (REPLACE): the existing table is dropped and replaced with the query results. The existing table definition (schema) is deleted.
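To make the difference between the two full-refresh methods concrete, the statements below are a rough Snowflake-side illustration, not necessarily the exact statements TROCCO issues; the database, schema, and table names are placeholders.

```sql
-- Replace all records (TRUNCATE INSERT): rows are replaced, the table definition is kept.
TRUNCATE TABLE analytics.public.daily_summary;
INSERT INTO analytics.public.daily_summary
SELECT order_date, SUM(amount) AS total_amount
FROM analytics.public.orders
GROUP BY order_date;

-- Replace all (REPLACE): the table is dropped and rebuilt from the query result,
-- so the previous table definition is lost.
CREATE OR REPLACE TABLE analytics.public.daily_summary AS
SELECT order_date, SUM(amount) AS total_amount
FROM analytics.public.orders
GROUP BY order_date;
```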
Job settings

| Item name | Required | Default value | Description |
| --- | --- | --- | --- |
| Parallel execution of jobs | - | No parallel job execution | Select whether to run the job when another job for the same Data Mart Configuration is already running at execution time. |

  • No parallel job execution: the job is skipped and not executed.
  • Allow jobs to run in parallel: the job runs.