Data Mart - Snowflake
  • 17 Jul 2024

Help page for setting up data mart definitions using Snowflake.

Constraints

  • Only connection information whose authentication method is set to key pair authentication can be used with Data Mart - Snowflake.
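
For reference, key pair authentication requires that the Snowflake user referenced by the connection information has an RSA public key registered. A minimal sketch of that registration step, assuming a hypothetical user name TROCCO_USER and a key pair you have already generated, could look like the following:

```sql
-- Hypothetical example: register an RSA public key on the Snowflake user
-- used by the connection information (TROCCO_USER is an assumed name).
-- The value is the Base64 body of the public key, without the PEM header and footer lines.
ALTER USER TROCCO_USER SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';

-- Check that the key is registered (see the RSA_PUBLIC_KEY_FP property).
DESCRIBE USER TROCCO_USER;
```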

Setting items

STEP1 Basic settings

| Item name | Required | Default value | Contents |
| --- | --- | --- | --- |
| Snowflake connection information | Yes | - | From the pre-registered Snowflake connection information, select the one that has the permissions required for this data mart definition. |
| Warehouse | Yes | - | Specify the warehouse to be used for processing. |
| Custom variables | No | - | Custom variables set here can be used in the query and in table names. |
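
If you are unsure whether the role behind the selected connection information can actually use the warehouse specified here, the following Snowflake statements are one way to check from a worksheet (the role name is an assumption for illustration):

```sql
-- List the warehouses visible to the current role.
SHOW WAREHOUSES;

-- Inspect the privileges granted to the role used by the connection
-- information (TROCCO_ROLE is an assumed name).
SHOW GRANTS TO ROLE TROCCO_ROLE;
```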

Query settings

| Item name | Required | Default value | Contents |
| --- | --- | --- | --- |
| Query execution mode | Yes | Data transfer mode | You can choose from the following two modes. Data transfer mode: replace (full refresh) or append to a table simply by specifying the SQL and the destination table. Free description mode: execute any query (DDL, DELETE, INSERT, etc.) against the DWH to which you are connected. |
| Query | Yes | - | Enter the SQL that retrieves the transfer data from Snowflake. Custom variables can be used to determine values dynamically at the time of the TROCCO data transfer. |
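
As an illustration only (the database, table, and variable names below are assumptions), a data transfer mode query that uses a custom variable such as $target_date$ to limit the extracted rows could look like this:

```sql
-- Hypothetical query for data transfer mode.
-- $target_date$ is assumed to be a custom variable defined in this data mart
-- definition; TROCCO substitutes its value before the query runs on Snowflake.
SELECT
    order_id,
    customer_id,
    order_amount,
    ordered_at
FROM analytics.public.orders
WHERE CAST(ordered_at AS DATE) = '$target_date$';
```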

In addition, the following settings can be specified only when the query execution mode is data transfer mode.

Data output destination settings

| Item name | Required | Default value | Contents |
| --- | --- | --- | --- |
| Output destination database | Yes | - | Specify the database to which the data will be output. The database must already exist. |
| Output destination schema | Yes | - | Specify the name of the schema to which the data will be output. Schema names must consist only of letters, numbers, and underscores. The schema must already exist. |
| Output table | Yes | - | Specify the name of the table to which the data will be output. Table names must consist only of letters, numbers, and underscores. If the table does not exist, it will be created at the time of transfer. |
| Write settings for the output destination table | Yes | - | You can choose from the write methods listed below the table; a rough SQL equivalent of each is sketched after the list. |

  • Append: the query results are appended after the records of the existing table.
  • Truncate all records (TRUNCATE INSERT): the records of the existing table are truncated and replaced with the query results; the existing table's schema is preserved.
  • Replace all (REPLACE ALL): the existing table is dropped and replaced with the query results; the existing table's schema is deleted.
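
To make the difference between the write methods concrete, the statements below are roughly what each method amounts to in Snowflake SQL. This is an illustrative sketch with assumed names, not necessarily the exact statements TROCCO issues:

```sql
-- Assumed destination table: analytics.public.sales_mart.
-- The SELECT stands in for the query defined above.

-- Append: keep the existing rows and add the query results.
INSERT INTO analytics.public.sales_mart
SELECT * FROM analytics.public.orders;

-- Truncate all records (TRUNCATE INSERT): empty the table but keep its definition, then reload it.
TRUNCATE TABLE analytics.public.sales_mart;
INSERT INTO analytics.public.sales_mart
SELECT * FROM analytics.public.orders;

-- Replace all: drop and recreate the table from the query results; the old definition is lost.
CREATE OR REPLACE TABLE analytics.public.sales_mart AS
SELECT * FROM analytics.public.orders;
```
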
Job startup settings

| Item name | Required | Default value | Contents |
| --- | --- | --- | --- |
| Parallel execution of jobs | - | No parallel job execution | Select whether the job should run when another job with the same data mart definition is already running at the time the job starts. |

  • No parallel job execution: the job is skipped and not executed.
  • Allow jobs to run in parallel: the job runs.
