- Tutorials
- ETL/ELT and Orchestration
- Connection Configurations
- A8.net Connection Configuration
- Connection Configuration for Adobe Marketo Engage
- Amazon DynamoDB Connection Configuration
- Connection Configuration for Amazon Redshift
- Apple Search Ads Connection Configuration
- Connection Configuration for AppsFlyer
- Connection Configuration for Azure Blob Storage
- Connection Configuration for Azure Synapse Analytics
- Connection Configuration for Google BigQuery
- Connection Configuration for Box
- Connection Configuration for Databricks
- Connection Configuration for e-Sales Manager
- Connection Configuration for Facebook Ad Insights
- Application Creation and Token Acquisition Procedures for Facebook Ad Reporting, Conversions, Offline Conversions, and Custom Audiences
- Connection Configuration for freee Accounting
- Connection Configuration for GitHub
- Connection Configuration for Google Ad Manager
- Connection Configuration for Google Analytics
- Connection Configuration for Google Analytics 4
- Connection Configuration for Google Cloud Spanner
- Connection Configuration for Google Cloud Storage
- Connection Configuration for Google Drive
- Connection Configuration for Google Play
- Connection Configuration for Google Search Console
- Connection Configuration for Google Sheets
- Connection Configuration for Google Ads
- HTTP/HTTPS Connection Configuration
- HubSpot Connection Configuration
- Connection Configuration for JIRA
- Connection Configuration for KARTE Datahub
- Connection Configuration for kintone
- Connection Configuration for LINE Ads
- Connection Configuration for MongoDB
- Connection Configuration for Microsoft Advertising
- Connection Configuration for Microsoft SQL Server
- Connection Configuration for MySQL
- Connection Configuration for Oracle Database
- Connection Configuration for PostgreSQL
- Connection Configuration for RTB House
- Rtoaster insight+ with Google Account integration Connection Configuration
- S3 Connection Configuration
- Connection Configuration for Salesforce Marketing Cloud Account Engagement
- Connection Configuration for Salesforce
- Connection Configuration for SHANON MARKETING PLATFORM
- Connection Configuration for Slack
- Connection Configuration for SmartHR
- Connection Configuration for Snowflake
- Tableau Connection Configuration
- Connection Configuration for ValueCommerce
- Connection Configuration for X Ads (Formerly Twitter Ads)
- Connection Configuration for Yahoo! Search Ads and Display Ads (managed)
- Connection Configuration for Zendesk Support
- Connection Configuration for Zoho CRM
- Connection Configuration for Next SFA
- Notes on referencing BigQuery tables generated from Spreadsheets
- Connecting via SSH Tunnel
- Obtain Google Cloud JSON Key
- ETL Configurations
- ETL Configuration List
- Data Sources
- Data Source - A8.net
- Data Source - Adobe Marketo Engage
- Data Source - Amazon Aurora MySQL
- Data Source - Amazon CloudWatch Logs
- Data Source - Amazon DynamoDB
- Data Source - Amazon Redshift
- Data Source - Amazon S3
- Data Source - App Store Connect API
- Data Source - Apple Search Ads
- Data Source - AppsFlyer
- Data Source - Amazon Athena
- Data Source - Azure Blob Storage
- Data Source - Google BigQuery
- Data Source - Box
- Data Source - Cisco Secure Endpoint
- Data Source - Criteo
- Data Source - CrowdStrike
- Data Source - Databricks
- Data Source - Elasticsearch
- Data Source - e-Sales Manager
- Data Source - Facebook Ad Creative
- Data Source - Facebook Lead Ads
- Data Source - Facebook Ad Insights
- Data Source - freee Accounting
- Data Source - FTP/FTPS
- Data Source - GitHub GraphQL API
- Data Source - Google Ad Manager
- Data Source - Google Ads
- Data Source - Google AdSense
- Data Source - Google Analytics
- Data Source - Google Analytics 4
- Data Source - Google Cloud Spanner
- Data Source - Google Cloud Storage
- Data Source - Google Drive
- Data Source - Google Play
- Data Source - Google Search Console
- Data Source - Google Sheets
- Data Source - HTTP/HTTPS
- Data Source - DataSpot
- Data Source - JIRA
- Data Source - KARTE Datahub
- Data Source - kintone
- Data Source - LINE Ads
- Data Source - Microsoft SQL Server
- Data Source - Microsoft Advertising
- Data Source - MongoDB
- Data Source - MySQL
- Data Source - MySQL binlog (CDC)
- Data Source - Oracle Database
- Data Source - PostgreSQL
- Data Source - RDBMS Version Support Chart
- Data Source - Repro
- Data Source - Rtoaster insight+ with Google Account integration
- Data Source - RTB House
- Data Source - Salesforce Report
- Data Source - Salesforce
- Data Source - Salesforce Marketing Cloud Account Engagement
- Data Source - SFTP
- Data Source - Shopify
- Data Source - SHANON MARKETING PLATFORM
- Data Source - Slack
- Data Source - SmartHR
- Data Source - Snowflake
- Data Source - Tableau CRM Analytics
- Data Source - TikTok Ads
- Data Source - Treasure Data
- Data Source - X Ads (Formerly Twitter Ads)
- Data Source - TROCCO Web Activity Log
- Data Source - TROCCO
- Data Source - Yahoo! Display Ads (Managed)
- Data Source - ValueCommerce
- Data Source - Yahoo! Search Ads
- Data Source - Zendesk Support
- Data Source - Zoho CRM
- Data Source - Next SFA
- Data Source - Local Files
- Data Destinations
- Data Destination - Adobe Marketo Engage
- Data Destination - Amazon Redshift
- Data Destination - Amazon S3
- Data Destination - Amazon S3 Parquet
- Data Destination - Azure Blob Storage
- Data Destination - Azure Synapse Analytics
- Data Destination - Google BigQuery
- Data Destination - Box
- Data Destination - Braze
- Data Destination - Databricks
- Data Destination - e-Sales Manager
- Data Destination - Facebook Offline Conversions
- Data Destination - Facebook Custom Audience (Beta Feature)
- Data Destination - Facebook Conversions API
- Data Destination - FTP/FTPS
- Data Destination - Google Ads Conversions
- Data Destination - Google Analytics 4 Measurement Protocol
- Data Destination - Google Analytics Measurement Protocol
- Data Destination - Google Cloud Storage
- Data Destination - Google Drive
- Data Destination - Google Sheets
- Data Destination - Google Offline Conversion
- Data Destination - HubSpot
- Data Destination - KARTE Datahub
- Data Destination - kintone
- Data Destination - LINE Conversion API
- Data Destination - Microsoft SQL Server
- Data Destination - MySQL
- Data Destination - PostgreSQL
- Data Destination - Rtoaster insight+
- Data Destination - Salesforce
- Data Destination - Salesforce Marketing Cloud
- Data Destination - SFTP
- Data Destination - Snowflake
- Data Destination - Treasure Data
- Data Destination - X Ads (Formerly Twitter Ads) Web Conversions
- Data Destination - Yahoo! JAPAN Ads Display Ads Conversion Measurement API (Beta Feature)
- Data Destination - Zoho CRM
- Common Settings
- Other Settings
- Managed ETL Configurations
- Data Mart
- dbt Integration
- Workflow
- Useful Features
- Connection Configurations
- Data Management
- Data Catalog
- About Data Catalog
- How to Start Using
- Understanding Data (Table Information)
- View and analyze relationships between data (ER diagram, JOIN analysis)
- Defining Data Relationships (see Column Setting)
- Finding data (table/column search)
- Write a query (Query Editor)
- Data Catalog Setting
- Data Store Coordination Management
- List of functions supported by each data store
- Automatic asset capture function
- Automatic metadata acquisition function
- Data Catalog Glossary
- Teams
- System Management
- Account Management
- TROCCO Web Activity Log
- FAQ
- Error Handling
- Other General FAQ
- How to resolve errors caused by a high volume of transfers from Data Source BigQuery
- How to get a value by specifying an element when a Data Source column's JSON contains an array
- Can I speed up the transfer rate?
- Retention period for ETL Job system logs in TROCCO
- How to specify the time zone for time-valued columns in the Data Destination
- About transferable capacity
- Use of BigQuery Scripts when Creating a Data Mart Configuration
- About file compression settings
- How to create a SOQL query from the Developer Console
- Access rights to the created ETL Configuration, Data Mart Configuration, etc.
- Custom Variable Loop Execution
- Efficiently update master tables on a DWH without duplication
- Google API Disclosure
- Recommended System Requirements (Software)
- Detailed information about your plan
- More information about Professional Plans
- Maintenance with Downtime
- Release Notes
- List of Release Notes
- Release Notes - November 2024
- Release Notes - October 2024
- Release Notes - September 2024
- Release Notes - August 2024
- Release Notes - July 2024
- Release Notes - June 2024
- Release Notes - May 2024
- Release Notes - April 2024
- Release Notes - March 2024
- Release Notes - February 2024
- Release Notes - January 2024
- Release Notes - December 2023
- Release Notes - November 2023
- Release Notes - October 2023
- Release Notes - September 2023
- Release Notes - August 2023
- Release Notes - July 2023
- Release Notes - June 2023
- Release Notes - May 2023
- Release Notes - April 2023
- Release Notes - March 2023
- Release Notes - February 2023
- Release Notes - January 2023
- Release Notes - December 2022
- Release Notes - November 2022
- Release Notes - October 2022
- Release Notes - September 2022
- Release Notes - August 2022
- Release Notes - July 2022
- Release Notes - June 2022
- Release Notes - May 2022
- Release Notes - April 2022
- Release Notes - March 2022
- Announcement
- Terms of Service
Summary
This is the help page for setting up a Connection Configuration for Amazon Redshift.
Permissions required for a Connection Configuration used for a Data Source
You must create a user with the following permission:
- SELECT privilege on tables in the Data Source schema
To grant the minimum privileges to the user, refer to the following SQL commands.
```sql
ALTER DEFAULT PRIVILEGES FOR USER <your_db_user_name> IN SCHEMA <your_source_schema>
  GRANT SELECT ON TABLES TO <your_db_user_name>;
GRANT USAGE ON SCHEMA <your_source_schema> TO <your_db_user_name>;
```
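Note that ALTER DEFAULT PRIVILEGES applies only to tables created after it is executed. If the schema already contains tables, they likely need a direct grant as well, and Redshift's privilege functions can confirm the result. The following is a sketch under that assumption; <your_table> is a hypothetical placeholder.

```sql
-- Default privileges cover only tables created later; grant on existing
-- tables directly as well (assumption: the schema already has tables):
GRANT SELECT ON ALL TABLES IN SCHEMA <your_source_schema> TO <your_db_user_name>;

-- Verify the grants took effect; <your_table> is a placeholder:
SELECT HAS_SCHEMA_PRIVILEGE('<your_db_user_name>', '<your_source_schema>', 'USAGE');
SELECT HAS_TABLE_PRIVILEGE('<your_db_user_name>', '<your_source_schema>.<your_table>', 'SELECT');
```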
Permissions required for a Connection Configuration used for a Data Destination
To transfer data to Amazon Redshift at high speed, TROCCO first stages the data temporarily in Amazon S3 and then bulk-loads it into Amazon Redshift using the COPY command.
To support this transfer process, a Connection Configuration used for an Amazon Redshift Data Destination must be granted permissions for both Amazon S3 and Amazon Redshift.
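For illustration, the load step conceptually resembles the COPY statement below. This is only a sketch: the staging path, file format, and table name are hypothetical, and TROCCO manages the actual staging objects and COPY options internally.

```sql
-- Simplified sketch of the bulk-load pattern described above (illustrative
-- only; the staging path, format, and credential style are assumptions):
COPY <your_destination_schema>.<your_table>
FROM 's3://<YOUR_DESTINATION_BUCKET_NAME>/staging/data.csv.gz'
CREDENTIALS 'aws_access_key_id=<ACCESS_KEY_ID>;aws_secret_access_key=<SECRET_ACCESS_KEY>;token=<SESSION_TOKEN>'
GZIP
CSV;
```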
Setting up permissions for Amazon S3 (creating an IAM user)
Data is temporarily staged in Amazon S3 during transfers to Amazon Redshift.
The IAM user's credentials are used for this purpose.
Create an IAM user with the following permissions, then enter its AWS credentials in the Connection Configuration:
- s3:GetObject
- s3:PutObject
- s3:DeleteObject
- s3:ListBucket
  - Grants permission to list some or all of the objects in an S3 bucket.
- s3:ListAllMyBuckets
  - Grants permission to list all buckets owned by the authenticated sender of the request.
- s3:GetBucketLocation
  - Required for buckets in different regions.
- sts:GetFederationToken
  - Grants the right to retrieve a federated user's temporary security credentials (consisting of an access key ID, a secret access key, and a security token).
To grant minimum privileges, please refer to the following policy.
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "S3Permissions",
      "Effect": "Allow",
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:DeleteObject",
        "s3:GetBucketLocation",
        "s3:GetObject",
        "s3:ListAllMyBuckets",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::<YOUR_DESTINATION_BUCKET_NAME>",
        "arn:aws:s3:::<YOUR_DESTINATION_BUCKET_NAME>/*"
      ]
    },
    {
      "Sid": "STSPermissions",
      "Effect": "Allow",
      "Action": [
        "sts:GetFederationToken"
      ],
      "Resource": "*"
    }
  ]
}
```
Setting up permissions for Amazon Redshift (creating a user)
You must create a user with the following permissions:
- CREATE/DROP TABLE privileges in the Data Destination schema
- Authority to execute the COPY command in the Data Destination schema
To grant the minimum privileges to the user, refer to the following SQL commands.
```sql
ALTER DEFAULT PRIVILEGES FOR USER <your_db_user_name> IN SCHEMA <your_destination_schema>
  GRANT SELECT, INSERT, DELETE ON TABLES TO <your_db_user_name>;
GRANT USAGE ON SCHEMA <your_destination_schema> TO <your_db_user_name>;
```
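Note that the statements above grant USAGE on the schema and DML privileges on its tables, but not the CREATE privilege implied by the CREATE/DROP TABLE requirement. Unless the user already owns the schema, the following additional grant is likely needed; this is a sketch, not an official TROCCO requirement.

```sql
-- Allow the user to create tables in the Data Destination schema
-- (DROP is available to the table owner, i.e., for tables this user creates):
GRANT CREATE ON SCHEMA <your_destination_schema> TO <your_db_user_name>;
```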