List of Release Notes

    This page contains weekly releases.

    2024-07-29

    UI・UX

    ETL Configuration list can now be filtered by Connection Configuration type.

    In a previous update, filtering by Connector was supported.
    With this update, filtering by individual Connection Configuration is now also supported.
    This makes it easy to see which Connection Configuration is used for any given ETL Configuration.

    2024-07-22

    UI・UX

    Changes to user management screens

    The design of the user management screen has been changed.
    This allows each user's permissions (operations allowed on TROCCO) to be checked at a glance.
    When used in conjunction with COMETA, users can also be narrowed down.

    2024-07-16

    ETL Configuration

    Data Source Databricks

    A new Data Source, Databricks, has been added.
    For more information, see Data Source - Databricks.

    Notice

    Increased memory size for ETL Job

    The release within the week of 07/16/2024 will increase the memory size used for data transfer.

    • Memory size before the change: 2 GiB
    • Memory size after the change: 6 GiB

    This change applies to ETL Configurations created after the above release.
    Because ETL Jobs now run with more memory, performance may improve compared to jobs run before the change.
    However, the following Connectors already use an exceptional memory size of 15 GiB, and 15 GiB will continue to apply to them after this change.

    List of Connectors with exceptional memory size of 15 GiB

    Elimination of Direct and Aggregate transfer functions

    The following transfer functions have been eliminated:

    • Direct Transfer (selectable when the Data Source Amazon S3 -> Data Destination SFTP combination is used)
    • Aggregate Transfer (selectable when the Data Source Google BigQuery -> Data Destination Amazon S3 combination is used)

    ETL Configuration

    When an OutOfMemoryError occurs, the execution log will clearly indicate this.

    When an OutOfMemoryError occurs, it is now clearly indicated in the execution log.
    If this message is displayed, please refer to the section on what to do when OutOfMemoryError occurs.

    Input restrictions added for Data Source HTTP and HTTPS

    ETL Configuration STEP2 > Input Option now has upper and lower limits for the values that can be entered for each setting item.
    For more information, see Data Source - HTTP/HTTPS.

    2024-07-01

    Notice

    End of support for Google Analytics (Universal Analytics)

    In response to Google's discontinuation of Universal Analytics, the following Connectors will be discontinued on July 01, 2024.

    • Data Source - Google Analytics
    • Data Destination - Google Analytics Measurement Protocol

    After 07/01/2024, it is no longer possible to create new ETL Configurations or Connection Configurations. Also, running a Job from an existing ETL Configuration will result in an error.

    Please consider switching to Google Analytics 4 and using Data Source - Google Analytics 4 and Google Analytics 4 Connection Configuration in the future.

    2024-06-24

    ETL Configuration

    Data Destination Databricks

    New Data Destination Databricks added.
    For more information, see Data Destination - Databricks.

    2024-06-17

    Notice

    Limit job execution by the maximum number of simultaneous executions

    TROCCO limits the number of jobs that can run simultaneously within an account.
    Following the rate plan change in 04/2024, data mart jobs are now also subject to this limitation.
    For more information on this limitation, please refer to Job Concurrency Limit.

    dbt linkage

    Compatible with dbt versions 1.7 and 1.8

    dbt Core v1.7 and dbt Core v1.8 can now be specified.
    The dbt version can be selected from the dbt Git repository.

    2024-06-10

    Notice

    Restrictions on Job Execution in the Free Plan

    If you are using the Free plan, you can no longer run jobs when the cumulative monthly processing time exceeds the processing time quota.
    The accumulated processing time returns to 0 hours at midnight (UTC+9) on the first day of the following month. If any jobs were not executed, they should be rerun in the following month.

    ETL Configuration

    Record ID can be specified as update key in update/upsert of Data Destination kintone.

    Record IDs can now be specified as update keys.
    If you wish to specify a record ID, enter $id as the update key.

    2024-06-03

    API Update

    Data Destination Google Ads Conversions

    Regarding extended conversions, the version of the Google Ads API used for transfer has been updated from v14.1 to v16.

    Please refer to the Google Ads API documentation for information on the new version.

    2024-05-27

    Data Mart Configuration

    Datamart Snowflake

    For the full data replacement mode, the write setting for the output destination table can now be selected from TRUNCATE INSERT or REPLACE.

    • In the case of TRUNCATE INSERT, the schema of the existing table is not deleted.
    • In the case of REPLACE, the schema of the existing table is deleted.

    For more information on the difference between the two, see Data Mart - Snowflake.
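
    As a rough illustration of the difference, the two write modes correspond to SQL statement sequences like the ones sketched below. This is a minimal sketch only; the table names are hypothetical and the exact statements TROCCO issues are not documented here.

        # TRUNCATE INSERT: the existing table is kept, so its schema, grants,
        # and other properties survive; only the rows are replaced.
        truncate_insert = [
            "TRUNCATE TABLE analytics.daily_sales;",
            "INSERT INTO analytics.daily_sales SELECT * FROM staging.daily_sales;",
        ]

        # REPLACE: the table itself is recreated, so the existing schema is
        # dropped together with the old table.
        replace = [
            "CREATE OR REPLACE TABLE analytics.daily_sales AS SELECT * FROM staging.daily_sales;",
        ]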

    API Update

    Data Source Google Ads / Data Destination Google Ads Conversions

    The version of Google Ads API used in the transfer was updated from v14.1 to v16.

    Regarding Data Destination Google Ads Conversions, only offline conversions have been updated.
    Extended conversion updates will be available next week.

    Please refer to the Google Ads API documentation for information on the new version.

    Data Source Yahoo! Search Ads and Data Source Yahoo! Display Ads (managed)

    The version of YahooAdsAPI used for transfer has been updated from v11 to v12.
    Please refer to the YahooAdsAPI documentation for information on each new version.

    2024-05-20

    Notice

    API Update for Data Source Google Ads

    On Wednesday, May 22, 2024, between 10:00 am and 4:00 pm, there will be an update to the Google Ads API used by Data Source Google Ads.
    Some breaking changes will occur with the API update.
    Please see 2024/05/16 Destructive Changes to Data Source Google Ads for more information on the resources/fields that will be removed or changed and what to do about it.

    2024-05-13

    Notice

    On May 9, 2024, we announced our brand renewal.
    With the rebranding, the logotype of the product has been changed from trocco to TROCCO, as has the color scheme of the logo.

    The new logo image files and logo guidelines are available on our website.
    Also, please refer to the press release regarding the brand renewal.

    Connection Configuration

    Allow SSH private key passphrase to be entered in Microsoft SQL Server Connection Configuration.

    An SSH private key passphrase field has been added to the configuration items.
    This allows you to connect to Microsoft SQL Server using a passphrase-protected private key.

    2024-04-30

    Security

    TROCCO API is now restricted by IP address when executed.

    Execution of the TROCCO API is now subject to IP address restrictions.
    This allows for more secure use of the TROCCO API.

    Action required due to this specification change

    If you are already using the TROCCO API and have registered at least one IP address under Account Security > Allowed IP Addresses, you must add the IP address used to call the TROCCO API to the Allowed IP Addresses.

    Data Mart Configuration

    Data Mart Azure Synapse Analytics

    A new Data Mart, Azure Synapse Analytics, has been added.
    For more information, see Data Mart - Azure Synapse Analytics.

    ETL Configuration

    Data Source Google Sheets columns can be extracted.

    Previously, it was necessary to manually enter the name and data type of each column to be retrieved in ETL Configuration STEP 1.
    Now, the ability to extract column information from the spreadsheet to be transferred has been added.

    After entering the various setting items, click Extract Column Information to automatically set the column names and Data Setting.
    With the addition of the above functionality, an entry field has been added to specify the starting column number for extraction.
    For more information, see Data Source - Google Spreadsheets.

    2024-04-22

    ETL Configuration

    Data Source Yahoo! Search Ads and Data Source Yahoo! Display Ads (managed)

    Due to the discontinuation of YahooAdsAPI v11, a Base Account ID entry field has been added to the configuration of the following Data Sources in order to support the YahooAdsAPI version update.

    • Data Source Yahoo! Search Ads
    • Data Source Yahoo! Display Ads (managed)

    For details, see "MCC Multi-Tiered" in v12 Upgrade Information.

    The transition to v12 is scheduled for mid-May 2024.
    As soon as the migration to v12 is complete, ETL Configurations that do not yet have a Base Account ID entered will result in an error when run.
    Please edit your existing ETL Configuration prior to v12 migration.

    2024-04-15

    ETL Configuration

    Data Destination kintone to be able to transfer to table

    Data can now be transferred to tables (formerly subtables) in the kintone application.
    For details on how to transfer, please refer to Updating Tables (formerly Subtables) in the Data Destination kintone application.

    Data Source Google BigQuery allows users to select "Bucket Only" for temporary data export specification.

    When transferring data from Data Source Google BigQuery, data is temporarily output to Google Cloud Storage.
    In this case, you can now specify just a bucket as the output destination for the temporary data.

    Note that with the conventional format of entering a Google Cloud Storage URL, temporary data is output to the same path every time unless Custom Variables are used.
    As a result, data on Google Cloud Storage could be overwritten.
    If only a bucket is specified, on the other hand, an internally unique path is created and the temporary data is output to that path.
    This avoids the aforementioned situation where data in Google Cloud Storage is overwritten or deleted.

    2024-04-08

    UI・UX

    Allow organization name to be set in TROCCO account

    You can now set an organization name for your TROCCO account.
    The organization name makes it easier to identify which TROCCO account you are logging into if you are managing multiple TROCCO accounts, for example.
    For more information, see About Organization Names.

    Managed ETL

    Add Amazon Redshift as a Data Destination

    Amazon Redshift can now be selected as a Managed ETL Data Destination.
    ETL Configurations that retrieve tables from the Data Source in bulk and transfer them to Amazon Redshift can be created and managed centrally.

    API Update

    Data Source Google Ad Manager

    The version of the Google Ad Manager API used during transfer has been updated from v202305 to v202311.
    For more information on the new version, see Google Ad Manager API.

    2024-04-01

    Notice

    Effective April 1, 2024, the rate plan will be revised.
    For details, please refer to the fee plan.

    ETL Configuration

    Data Source Shopify supports retrieval of collections.

    Data Source Shopify targets can now select a collection object.
    See Data Source - Shopify for more information, including each item to be retrieved.

    Additional types can be specified in Data Destination Amazon Redshift

    The following items have been added to the Data Type in the STEP2 Output Option Column Setting for ETL Configuration Amazon Redshift.

    • TIME
    • DATE

    Added update processing settings for NULL values transferred to Data Destination kintone.

    When the update data for an existing record in kintone contains a NULL value, you can now select the update process for that record.
    You can choose to update with NULL or skip updating in the advanced settings of ETL Configuration STEP 1.

    2024-03-25

    ETL Configuration

    Data Destination Azure Synapse Analytics

    A new Data Destination, Azure Synapse Analytics, has been added.
    For more information, see Data Destination - Azure Synapse Analytics.

    2024-03-18

    ETL Configuration and Managed ETL

    Temporary stage can be deleted when ETL Job to Snowflake fails.

    If an ETL Job to Snowflake fails, you can now choose to delete the temporary stage.
    For more information, see Data Destination - Snowflake > STEP1 Advanced Configuration.

    UI・UX

    Maximum number of ETL Configuration Data Setting data previews changed to 20.

    The maximum number of data items displayed in the Data Preview of ETL Configuration STEP2 and ETL Configuration Details has been changed to 20.
    This change shortens the time it takes for the preview of schema data to appear in ETL Configuration STEP 2.

    API Update

    Data Destination Facebook Custom Audience (Beta Feature)

    The version of the Facebook API used for transfer has been updated from v17 to v18.
    Please refer to the Meta for Developers documentation for the new version.

    2024-03-11

    Workflow

    Allow the error log of an ETL Job to be viewed within the execution log of a Workflow Job.

    When an ETL Job embedded in a workflow fails, the error log for the relevant ETL Job can now be viewed from the workflow execution log.
    You can check the error log by clicking the corresponding task in the workflow execution log.

    API Update

    The Facebook API used for the following Connector has been updated from v17 to v18.

    • Data Source Facebook Ad Insights
    • Data Source Facebook Ad Creative
    • Data Source Facebook Lead Ads
    • Data Destination Facebook Conversions API

    Please refer to the Meta for Developers documentation for the new version.

    Audit Log

    Removed "Restore Past Revisions" action in ETL Configuration from audit log capture.

    Removed "Update ETL Configuration (restore past revisions of change history)" from actions eligible for audit logging.
    For more information, please refer to the Change History of the Audit Log function.

    2024-03-04

    Notice

    About TROCCO Web Activity Log Help Documentation

    Until now, the help documentation for TROCCO Web Activity Log has been available on the Confluence space.
    The help documentation has now been transferred to the TROCCO Help Center. Please refer to the TROCCO Web Activity Log in the future.
    The help documentation on the Confluence side will be closed soon.

    UI・UX

    ETL Configuration list can now be filtered by Connection Configuration.

    Connection Configuration has been added as a filter item in the ETL Configuration list.
    You can filter by the ETL Configuration in which the specified Connection Configuration is used.
    For more information, see List of ETL Configurations > Filter.

    ETL Configuration

    Allow regular expressions to be used to specify the path to the file to be retrieved by Data Source SFTP.

    In ETL Configuration STEP1, the path to the file to be retrieved can now be specified with a regular expression.
    For example, if you enter .csv$ in the path regular expression, only csv files under the directory specified by the path prefix will be retrieved.
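
    As a generic illustration of how such a pattern narrows down the files, the sketch below applies .csv$ to a few hypothetical file paths using Python's standard re module.

        import re

        files = ["data/2024-01.csv", "data/2024-01.csv.gz", "data/readme.txt"]
        pattern = re.compile(r".csv$")

        # Only paths whose name ends in ".csv" survive the filter.
        matched = [path for path in files if pattern.search(path)]
        print(matched)  # ['data/2024-01.csv']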

    Workflow

    Manually Executed Workflow Jobs to show the user who executed them

    Previously, when a Workflow Job was manually executed, it was not indicated which user executed it.
    With this change, the email address of the user who executed the job is now displayed in the following cases.

    • When executed by clicking the Execute button on the Workflow definition details screen
    • When executed by clicking Re-Execute from the stop position on the Workflow Job Details screen

    To display a link to the Workflow Job from which it was run

    Previously, Workflow Jobs that were executed as tasks of another Workflow Job did not indicate by which Workflow Job they were executed.
    With this change, a link to the Workflow Job from which it was run is now displayed.

    2024-02-26

    ETL Configuration

    Microseconds and nanoseconds added to time units for UNIX Time conversion.

    The range of UNIX Time units that can be handled in the UNIX Time conversion in ETL Configuration STEP 2 has been expanded.
    Microseconds and nanoseconds can now be selected as the UNIX Time unit before and after conversion.
    For more information, see UNIX Time conversion.
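
    The unit matters because the same instant is represented by very different numbers depending on whether it is expressed in seconds, microseconds, or nanoseconds. A minimal sketch (the timestamp value is arbitrary):

        from datetime import datetime, timezone

        ts_seconds = 1_700_000_000            # UNIX Time in seconds
        ts_micro = ts_seconds * 1_000_000     # the same instant in microseconds
        ts_nano = ts_micro * 1_000            # the same instant in nanoseconds

        # Converting back to a date/time requires knowing which unit was used.
        dt = datetime.fromtimestamp(ts_micro / 1_000_000, tz=timezone.utc)
        print(dt.isoformat())                 # 2023-11-14T22:13:20+00:00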

    2024-02-19

    Data Catalog

    Snowflake version of Data Catalog supports "Automatic Metadata Inheritance" and "Column Lineage".

    Until now, automatic metadata inheritance and column lineage in the Data Catalog were supported only in the Google BigQuery version.
    With this change, the same functionality is now available in the Snowflake version of the Data Catalog.

    ETL Configuration

    Changed how to set merge key for Data Destination PostgreSQL.

    The method of setting the merge key when the transfer mode is set to UPSERT (MERGE) in ETL Configuration STEP1 has been changed.
    Previously, a merge key had to be set in ETL Configuration STEP 2.
    With this change, when UPSERT (MERGE) is selected as the transfer mode in ETL Configuration STEP 1, the Merge Key setting item will appear directly below.

    API Update

    Data Source Shopify

    The version of the Shopify API used for transfers has been updated from 2023-01 to 2024-01.
    Please refer to the documentation in the Shopify API reference docs for the new version.

    2024-02-13

    Managed ETL

    Add Microsoft SQL Server as Data Source

    Microsoft SQL Server can now be selected as the Data Source for Managed ETL.
    Microsoft SQL Server tables can be imported in bulk, and the associated ETL Configuration can be created centrally.
    See Managed ETL Configuration > Data Source Microsoft SQL Server for various entry fields.

    ETL Configuration

    Expanded columns of master data for ads that can be retrieved from Data Source LINE Ads.

    Added small_delivery to the columns of data retrieved as master data for ads.
    Master data for advertisements can be obtained when Master Data (Advertisements) is selected for the Download Type and Advertisements is selected for the Master Data Type in STEP 1 of ETL Configuration.

    Note that to incorporate the small_delivery column in an existing ETL Configuration, you must edit the ETL Configuration and run the Automatic Data Setting.
    Select "Execute Automatic Data Setting" on the screen that appears when moving from STEP1 to STEP2 of the Edit ETL Configuration screen, and save it.

    2024-02-05

    ETL Configuration

    Expanded the types of dimensions that can be specified in Data Source Criteo

    CampaignId and Campaign can now be selected in Dimension Name in ETL Configuration STEP 1.
    Dimension name is an item that appears when statistics is selected as the report type.

    API Update

    Data Source Yahoo! Search Ads and Data Source Yahoo! Display Ads (managed)

    The version of YahooAdsAPI used for transfer has been updated from v10 to v11.
    Please refer to the YahooAdsAPI | Developer Center documentation for information on the new version.

    Due to the API update, the old metrics have been discontinued.
    From now on, if a column whose name contains "(old)" is specified, the corresponding new column will be obtained automatically.

    2024-01-29

    ETL Configuration

    Data Destination Snowflake supports schema tracking.

    Data Destination Snowflake now supports schema tracking.
    Schema Tracking is a function that automatically corrects the schema of the Incremental Data Transfer table and resolves the schema difference between the Data Destination and the Connector's table.
    From now on, it will no longer be necessary to manually modify the schema on the Snowflake side in the event of a difference in the above schema.

    UI・UX

    ETL Configuration list can be filtered by regular expression.

    The ETL Configuration list can now be narrowed down by regular expression.
    See Filtering ETL Configuration Names with Regular Expressions for more information on the notation of regular expressions that can be entered.

    Time Zone Configuration values are now applied by default when creating Managed ETL Configuration.

    The time zone specified in the Time Zone Configuration is now used as the default for the time zone selected in STEP 1 when creating a Managed ETL Configuration.

    API Update

    Data Destination Google Ads Conversions

    The version of Google Ads API used during transfer has been updated from v13.1 to v14.1.
    Both offline and extended conversions have been updated.
    Please refer to the Google Ads API documentation for information on the new version.

    2024-01-22

    Managed ETL

    Enabled bulk selection and deselection of tables and filtering of table names

    Previously, tables could only be selected one pagination page at a time (up to 100 tables).
    This change allows for bulk selection and bulk deselection regardless of pagination.
    In addition, it is now possible to filter by table name.

    This change will be applied to the following screens.

    • New creation STEP2
    • List of unadded tables
    • Check Created/Dropped Tables

    ETL Configuration

    Added "Change event" to the resource type of Data Source Google Ads.

    Change event (change_event) has been added to "Resource Type (Report Type)" in STEP 1 of ETL Configuration.
    You can now get a report of changes that have occurred in your account.
    For more information on change_event, please refer to the Google Ads API documentation.

    Data Source ValueCommerce to get reports for advertisers.

    Previously, only affiliate sites were eligible to obtain reports.
    With this change, advertiser reports can also be retrieved.
    For more information, see Data Source - ValueCommerce.

    UI・UX

    Redesigned pop-up menu in the upper right corner of the screen

    The pop-up menu that appears when you click your own email address area has been redesigned.
    In addition to being able to check the organization ID and your own privileges, you can now move to various settings related to accounts and users with a single click.

    In addition, links to the following pages have been moved from the pop-up menu to the sidebar on the left side of the screen with this change.

    • GitHub access token (under External Linkage)
    • TROCCO API Key (under External Linkage)
    • Audit log output

    API Update

    Data Source Google Ads / Data Destination Google Ads Conversions

    The version of Google Ads API used during transfer has been updated from v13.1 to v14.1.
    Please refer to the Google Ads API documentation for information on the new version.

    2024-01-15

    ETL Configuration

    Data Source TROCCO Web Activity Log data acquisition period can be specified.

    Data Retrieval Period can now be specified in ETL Configuration STEP1.
    TROCCO Web Activity Log data can be retrieved for any time period by specifying a start and end date.
    For more information, see Data Source - TROCCO Web Activity Log.

    Connection Configuration

    HTTP/HTTPS Connection Configuration using Client Credentials can be created.

    In HTTP/HTTPS Connection Configuration, the Grant Type can now be selected from Authorization Code or Client Credentials.
    Previously, the grant type was fixed to Authorization Code, but with this release Client Credentials can also be selected.
    For details, please refer to the HTTP/HTTPS Connection Configuration.
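
    For reference, the Client Credentials grant obtains an access token directly from the token endpoint, without the user authorization step used by the Authorization Code grant. The sketch below shows the general shape of the exchange; the endpoint URL and credentials are hypothetical placeholders, and TROCCO performs this exchange for you based on the Connection Configuration.

        import requests

        # Hypothetical token endpoint and credentials, for illustration only.
        response = requests.post(
            "https://auth.example.com/oauth/token",
            data={
                "grant_type": "client_credentials",
                "client_id": "YOUR_CLIENT_ID",
                "client_secret": "YOUR_CLIENT_SECRET",
            },
        )
        access_token = response.json()["access_token"]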

    Data Catalog

    Changed specifications for importing partitioned tables

    In Google BigQuery Data Catalog, the specification to retrieve partitioned tables as catalog data has been changed.
    From now on, for partitioned tables, only the table with the latest date will be retrieved as catalog data.

    Reason for specification change

    Previously, all segments in a partitioned table were obtained as catalog data.
    Because each segment was considered a separate table in the Data Catalog, there were multiple hits for essentially the same table when searching for tables, and manual metadata entry operations such as basic metadata and user-defined metadata were difficult.
    From now on, only tables with the most recent dates will be retrieved, making tables more searchable and facilitating the operation of manual metadata entry.

    2023-12-26

    Workflow

    Added "HTTP Request" to workflow task

    An HTTP Request task, which can execute requests to external APIs, has been added to the workflow tasks.
    By incorporating tasks that communicate with external systems, you can build a more flexible and powerful workflow.
    HTTP Request Tasks can be configured and added on the Flow Edit screen of Workflow definitions.

    ETL Configuration

    TSV file input/output setting delimiter can now be entered as \t

    TSV file input and output settings now accept \t as the delimiter character.
    Please use this when setting the delimiter to a tab character in an ETL Configuration that uses a Connector that handles TSV files.
    You can set the delimiter in the Input Option and Output Option in ETL Configuration STEP 2.
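
    For reference, \t is the standard escape sequence for the tab character, as in the following generic Python sketch (example.tsv is a hypothetical file):

        import csv

        with open("example.tsv", newline="") as f:
            reader = csv.reader(f, delimiter="\t")  # "\t" denotes a tab character
            for row in reader:
                print(row)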

    Connection Configuration

    Elasticsearch Connection Configuration allows you to choose whether or not to use SSL communication.

    You can now choose whether or not to use SSL communication in Elasticsearch Connection Configuration.
    If you need to communicate with the connecting Elasticsearch using the HTTPS method, select Enabled.

    UI・UX

    Revamped UI for Programming ETL

    The UI for Data Setting > Programming ETL in STEP 2 of ETL Configuration has been revamped.
    Programming ETL allows for flexible conversion processes.
    For more information, see Programming ETL.

    2023-12-19

    Data Mart Configuration

    Data Mart Configuration change history to allow restoration of past configurations

    The Change History tab on the Data Mart Configuration details screen now allows restoration of settings for previous changes.
    If, for example, a data mart job run after a configuration change does not produce the expected results, you can immediately revert to the settings that were in place before the change.

    From the Change History tab of the Data Mart Configuration Details screen, click on Advanced/Restore>Restore Revision Settings to display the Data Mart Configuration Edit screen.
    Saving the settings displayed will restore the settings on the specified change history.

    ETL Configuration

    Connector selection screen to show "Recently used Connectors".

    When creating a new ETL Configuration, the Connector selection screen now displays the most recently used Connector at the top.
    It is now easier to find frequently used connectors among the many available connectors.

    Data Source Google Drive folder ID to be displayed in link text.

    The folder ID of the Data Source Google Drive displayed on the ETL Configuration details screen is now displayed as link text.
    Clicking on the folder will take you to the corresponding folder screen on Google Drive.
    It is now easier to check what files are stored in the folder for data acquisition.

    2023-12-13

    Data Mart Configuration

    Data Mart Configuration change history can now be viewed.

    The Data Mart Configuration detail screen now lists the history of past changes.
    You can also check the change differences from the previous change history.

    Workflow

    Workflow Job notifications can now include information about the results of the job execution.

    In the Workflow Job Notification Setting, it is now possible to embed a variable that contains information on the results of the job execution in the notification content.
    Embed information such as the total number of workflow tasks and the number of failed workflow tasks, allowing for flexible customization of the notification content.

    ETL Configuration

    UNIX Time and date/time formats can now be converted to each other

    UNIX Time conversion can now be configured in the Data Setting tab of ETL Configuration STEP 2.
    You can specify a column containing date/time data and convert between UNIX Time and a date/time format in either direction.
    For more information, see UNIX Time conversion.

    API Update

    Data Source Facebook Lead Ads

    The version of the Facebook API used for transfer has been updated from v16 to v17.
    See Meta for Developers for the new version.

    2023-12-05

    ETL Configuration

    Data Destination Snowflake to specify batch size

    "Batch Size (MB)" can now be specified in the advanced settings of ETL Configuration STEP1.
    If an error occurs during transfer due to insufficient memory, the batch size can be adjusted to eliminate the error.
    For more information, see Data Destination - Snowflake.

    Connection Configuration

    Allow Key Pair Authentication Snowflake Connection Configuration to be used in ETL Configuration

    Snowflake Connection Configuration created by Key Pair Authentication can now be used in ETL Configuration.
    This allows Data Source/Data Destination Snowflake and Data Mart Snowflake to use the same Connection Configuration.

    API Update

    Workflow Task Tableau Data Extraction

    The version of the Tableau REST API used for Tableau data extraction in the workflow task has been updated from 3.7 to 3.21.
    See Tableau REST API Help for information on the new version.

    Other

    Korean can be selected in the language settings.

    Korean can now be set as the language displayed on the TROCCO screen.
    For background on Korean language support, please see this article.
    ASCII.jp: primeNumber to Fully Expand "TROCCO" Overseas through Partnership with Korean SaaS Company

    2023-11-28

    ETL Configuration

    Data Source Yahoo! Search Ads to get reports related to ad display options.

    "CAMPAIGN_ASSET" and "ADGROUP_ASSET (beta)" have been added to the types of reports retrieved by Data Source Yahoo! Search Ads.
    Each can obtain the following report data, which has been available since March 2022.

    • CAMPAIGN_ASSET: Ad Display Options Report (Campaign)
    • ADGROUP_ASSET (beta): Ad Display Option Report (Ad Group)

    Account User

    Allow account privilege administrators to disable two-factor authentication for users in the account

    Only the Account Privilege Administrator can now disable two-factor authentication set by each user in the account.
    For more information, see About Account Privilege Administrators.

    2023-11-20

    ETL Configuration

    Data Source HTTP/HTTPS allows cursor-based paging requests

    Cursor-based paging requests are now possible when retrieving data via Data Source HTTP/HTTPS.
    When Cursor-based is selected in the paging settings of ETL Configuration STEP 1, the corresponding setting items will be displayed.
    For more information, see Data Source - HTTP/HTTPS.
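
    Conceptually, cursor-based paging passes an opaque cursor returned by each response into the next request, instead of incrementing a page number. The generic sketch below illustrates the idea; the URL and field names are hypothetical, and the actual parameter and response field names are configured in ETL Configuration STEP 1.

        import requests

        url = "https://api.example.com/records"  # hypothetical endpoint
        cursor = None
        rows = []
        while True:
            params = {"limit": 100}
            if cursor:
                params["cursor"] = cursor
            page = requests.get(url, params=params).json()
            rows.extend(page["items"])
            cursor = page.get("next_cursor")  # absent on the last page
            if not cursor:
                break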

    Data Source Google Analytics 4 to allow specifying the number of rows to retrieve in a single request.

    In the Advanced Settings of ETL Configuration STEP 1, you can now specify the number of rows to retrieve in a single request when retrieving data from Google Analytics 4.
    If OutOfMemoryError occurs during job execution, adjusting the value of this item may eliminate the error.
    For more information, see Data Source - Google Analytics 4.

    Workflow

    Workflow definitions can be duplicated from the Workflow Definition List screen.

    Workflow definitions can now be duplicated in the Workflow Definition List screen.
    You can duplicate any Workflow definition without going to the Workflow Definition Details screen.

    TROCCO API

    Allow the Account Privilege Administrator to manage all API Keys in the account

    Only the Account Privilege Administrator can now view, edit, and delete all API Keys issued by users in the account.
    Previously, only the user who issued the API Key could manage it, but now all API Keys in an account can be centrally managed.
    For more information, see About Account Privilege Administrators.

    The API Key list screen can be accessed from the Settings icon menu at the top of the screen.

    Other

    Email notification settings to apply line breaks in messages when they are sent.

    In the various notification settings when the notification method is set to Email, the line breaks in the text entered in the message are now applied to the outgoing email.
    *If the notification method is Slack, line breaks were already applied before this change.

    2023-11-14

    ETL Configuration

    Data Source kintone

    Automatic acquisition of field codes is now supported.
    Previously, each field code had to be entered manually in STEP 1 of ETL Configuration.
    From now on, it will be automatically retrieved in ETL Configuration STEP2 according to the specified app ID.

    In addition, when the application to be transferred contains a table (formerly a sub-table), the user can now select whether to transfer the records by dividing them by row in the table (formerly a sub-table) or by combining them into a single record.

    Please refer to Data Source - kintone for more information regarding the above.

    Data Destination Google Cloud Storage

    Naming conventions for multiple file output can now be specified.
    When parallel transfer is selected as the transfer mode, the transferred data may be split into multiple files.
    From now on, you can specify the naming of multiple files in the advanced settings of ETL Configuration STEP 1.

    Data Source Google Analytics 4

    The status of the "(other)" row when it is used can now be selected.
    You can choose whether to set the job to Succeeded or Error when the retrieved data contains "(other)" rows.
    For more information on the "(other)" row, please refer to the official Google Analytics documentation at [GA4] About the "(other)" row.

    Managed ETL

    Data Destination Snowflake

    UPSERT (MERGE) has been added as a Data Destination output mode.
    If there is a record in the existing table based on the merge key, the record is updated; if there is no record, the record is appended.
    For more information, see Data Destination - Snowflake.

    2023-11-06

    ETL Configuration

    Renaming of former Twitter-based Connector

    The following Connectors have been renamed:

    • Data Source Twitter Ads → X Ads (Formerly Twitter Ads)
    • Data Destination Twitter Ads Web Conversions → Data Destination X Ads (Formerly Twitter Ads) Web Conversions

    Other

    Revised TROCCO Terms of Use

    The TROCCO Terms of Use have been revised.
    The latest version is available at TROCCO Terms of Use.

    2023-10-31

    Account User

    Added the ability to delete users

    Users registered to a TROCCO account can now be deleted from TROCCO.
    In the future, users can be removed without contacting our Customer Success.
    See Deleting Users for more information, including permissions required to delete users.

    Connection Configuration

    JDBC driver version can be specified in Snowflake Connection Configuration.

    JDBC driver version can now be specified in Snowflake Connection Configuration.
    When creating a new Connection Configuration, 3.14.2 is selected as the default value.

    2023-10-24

    Managed ETL

    Add Snowflake to Data Destination

    Snowflake can now be selected as a Data Destination for Managed ETL.
    ETL Configurations that retrieve tables from the Data Source in bulk and transfer them to Snowflake can be created and managed centrally.
    See Managed ETL Configuration for more information.

    UI・UX

    Support for drag-and-drop column reordering of ETL Configuration

    Columns can now be rearranged by drag-and-drop operation in Column Setting of STEP 2 of ETL Configuration.
    The order of columns can be rearranged with intuitive operations.

    ETL Configuration

    Expanded the success status determination when retrieving Data Source HTTP/HTTPS data.

    In STEP 1 of ETL Configuration, you can now specify the status codes that are treated as successful responses when retrieving data.
    For more information, see Data Source - HTTP/HTTPS.

    TROCCO API

    The TROCCO API can now retrieve a list of ETL Jobs.
    You can specify any ETL Configuration and get a list of ETL Jobs using that ETL Configuration.
    For more information, see About the TROCCO API.

    API Update

    Data Source Yahoo! Search Ads

    The version of YahooAdsAPI used in the transfer has been updated from v9 to v10.
    For more information about the new version, please visit the YahooAdsAPI | Developer Center.

    2023-10-16

    Organization ID

    Organization ID to be visible on the TROCCO screen

    The organization ID is now displayed in the pop-up menu for logged-in users.
    Organization ID is a required field when logging in.
    In the unlikely event that you forget your organization ID, please ask a user who is already logged in to check the organization ID from the menu above and let you know.

    Account User

    Connection Configuration operation restrictions can now be applied on a per-user basis.

    It is now possible to prohibit individual users from creating, editing, or deleting Connection Configuration.
    By limiting the number of users who can create Connection Configurations, you can prevent connections to data sources to which they are not intended to connect, such as privately managed storage.
    This reduces the risk of unintended data leakage.
    For more information, see User Settings.

    API Update

    Data Source Google Ad Manager

    The version of the Google Ad Manager API used during transfer has been updated from v202211 to v202305.
    For more information on the new version, see Google Ad Manager API.

    2023-10-10

    Notice

    Change of login method

    On Monday, October 2, 2023, the method of logging into TROCCO was changed.
    For more information, please refer to the Change of Login Method to TROCCO.

    dbt linkage

    Allow any dbt version to be used in a dbt job

    The dbt version, previously fixed at dbt Core v1.3, can now be specified arbitrarily.
    On the New/Edit screen of the dbt Git repository, you can choose from the following versions

    • dbt Core v1.6
    • dbt Core v1.5
    • dbt Core v1.4
    • dbt Core v1.3

    UI・UX

    List of Data Mart Jobs

    Faster loading time
    The data loading process has been sped up, and the time until the list of data mart jobs is displayed has been shortened.

    Expanded filtering capabilities
    The filtering format has been changed from the traditional text entry format to a format where the name of the Data Mart Configuration is selected.
    This makes it possible to narrow the list of Data Mart Jobs by multiple Data Mart Configuration names.

    API Update

    Data Source Facebook Ad Creative and Data Destination Facebook Offline Conversions

    The version of the Facebook API used for transfer has been updated from v16 to v17.
    See Meta for Developers for the new version.

    2023-10-02

    Security

    Changed the period of time that login status is retained to 48 hours.

    For enhanced security, the retention period of login status has been changed to 48 hours.
    After 48 hours have elapsed since the last operation of TROCCO, the system enters a logout state.
    The next time you access TROCCO, you will need to log in.

    ETL Configuration

    Data Destination Facebook Custom Audience (Beta Feature)

    A new Data Destination, Facebook Custom Audience (Beta Feature), has been added.
    See Data Destination - Facebook Custom Audience (Beta Feature) for more information on the various input fields and column mappings.

    Added Update Processing Configuration for NULL values forwarded to Data Destination Salesforce.

    You can now select the update process when the update data for an existing record in Salesforce contains a null value.
    You can choose to update with NULL or skip updating in the advanced settings of ETL Configuration STEP 1.

    API Update

    Data Source Google Ads / Data Destination Google Ads Conversions

    The version of Google Ads API used in the transfer was updated from v12 to v13.1.
    Please refer to the Google Ads API documentation for information on the new version.

    2023-09-25

    Notice of Addition of Account Privilege Administrator Authority

    On September 5, 2023, a new privilege, Account Privilege Administrator, was added to TROCCO.
    The Account Privilege Administrator is the strongest of TROCCO's privileges and is therefore a special privilege that can only be granted to one user per account.
    Due to the nature of this privilege, for accounts that existed prior to September 5, 2023, we are providing a process that allows customers to choose which user will become the Account Privilege Administrator.
    Please respond to the email sent on September 14, 2023, which notified the following user about this transition.

    The earliest user registered to the TROCCO account who has not been removed from the account

    If you have any questions, such as if you do not know where to send an e-mail, please contact our Customer Success.

    ETL Configuration

    Data Destination File and Storage System Connector supports zip file compression

    Some Data Destination file/storage system Connectors now support compression of files in zip format.
    Zip can be selected for file compression in ETL Configuration STEP 1 with the following Connector as the Data Destination.

    • Data Destination Azure Blob Storage
    • Data Destination FTP/FTPS
    • Data Destination Google Cloud Storage
    • Data Destination KARTE Datahub
    • Data Destination S3
    • Data Destination SFTP

    dbt linkage

    dbt Git repository settings can now specify subdirectories

    Subdirectories in the Git Integration repository can now be specified as project directories for dbt integration.
    Previously, the directory for integration was fixed to the root directory of the Git repository.
    From now on, you can specify any directory in the Git Integration repository as the destination for dbt integration.

    2023-09-20

    Data Mart Configuration

    Data Mart Configuration in Google BigQuery allows clustering settings.

    Clustering configuration item was added to Data Mart Configuration in Google BigQuery.
    Clustering settings can now be configured for tables newly created by executing a data mart job.

    However, if a table already exists in the output destination, the settings of the existing table will be taken over instead of the contents of this setting.

    ETL Configuration

    Data Source Google BigQuery Job Waiting Timeout can now be configured.

    In the Advanced Settings of ETL Configuration STEP1, you can specify the timeout period in seconds for waiting for a Job.
    When there are many queries running in BigQuery, slot limits may cause jobs to wait until they are executed. If this waiting time reaches the timeout period, the relevant ETL Job will fail.
    In such cases, increasing the "Job Waiting Timeout (sec)" will avoid ETL Job failures.

    Data Destination Snowflake's NUMBER type output can specify precision and scale.

    You can specify the precision and scale of the NUMBER type in the Output Option tab > Column Settings > Data Type in ETL Configuration STEP 2.
    Use this function to convert data to be transferred to Snowflake to a NUMBER type of any precision and scale.

    For more information on the precision and scale of the NUMBER type, please refer to the official Snowflake documentation - NUMBER.
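
    As a reminder, precision is the total number of digits and scale is the number of digits to the right of the decimal point. For example, a NUMBER(10, 2) column holds values with up to 10 digits, 2 of which follow the decimal point; the rounding shown below is only an illustration of the effect of scale.

        from decimal import Decimal, ROUND_HALF_UP

        value = Decimal("12345.6789")
        # Reducing the value to scale 2, as a NUMBER(10, 2) column would store it.
        stored = value.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
        print(stored)  # 12345.68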

    UI・UX

    Filter button in the ETL Configuration list sidebar is now always displayed

    The Filter by this content button in the Filter by area of the ETL Configuration list is now fixed so that it always appears on the screen.

    dbt linkage

    dbt Job Setting in Google BigQuery supports selective location input.

    Locations for dbt Job Settings can now be entered from a selection.

    2023-09-12

    Managed ETL

    Schema Change Detection for ETL Configuration can now be set in bulk.

    Schema Change Detection for Managed ETL Configuration can now be set in bulk in STEP 3.
    You can receive notifications of schema changes without having to configure them individually in each ETL Configuration.

    ETL Configuration

    Expanded items for Date/Time Columns to be added in Transfer Date Column Setting.

    The Transfer Date Column Setting in the ETL Configuration STEP 2 Advanced Settings > Data Setting tab now allows for more flexible configuration.
    When the Transfer Date Column Data Type is set to string, the following items can be configured.

    • Format: Specifies the format of the expanded date/time value.
    • Time zone: Selects the time zone used for the format's time zone specifier, from Etc/UTC or Asia/Tokyo.

    Extended the data types that can be transferred from Data Source Oracle Database

    Data to be imported from Data Source Oracle Database can now be converted to string type and transferred.
    Click on Set Details in ETL Configuration STEP 1, specify the target column name, and select string for the Data Type.
    For example, numeric values with a large number of digits that previously lost data during transfer can now be converted to string type and transferred, thereby avoiding the data loss.

    Data Mart Configuration

    Custom Variable embedding support for Google BigQuery partition fields.

    Custom Variable embedding is now available in the Partition field of Data Mart Configuration in Google BigQuery.
    The value of the partition field can be dynamically specified at job execution.
    *The Partition field is a setting item that can be entered when partitioning by field is selected.

    Data Catalog

    Google BigQuery Data Catalog adds service accounts as an authentication method.

    "Service Account (JSON Key)" can now be selected as the authentication method for the Google BigQuery Data Catalog.
    For details, please refer to the "For First-Time Users" page.

    Security

    Allowed IP Addresses to be specified in CIDR format

    Allowed IP Addresses can now be specified in CIDR format (a writing style in which the IP address and subnet mask are expressed simultaneously).
    For example, if you enter 192.0.2.0/24, access is allowed from 192.0.2.0 to 192.0.2.255.
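
    A /24 block, for example, covers 256 addresses. Python's standard ipaddress module can be used to check exactly what a given CIDR block covers:

        import ipaddress

        network = ipaddress.ip_network("192.0.2.0/24")
        print(network.num_addresses)                          # 256
        print(network[0], network[-1])                        # 192.0.2.0 192.0.2.255
        print(ipaddress.ip_address("192.0.2.17") in network)  # True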

    Click the Add Allowed IP Address button on the Security screen to go to the Add Allowed IP Address screen.

    API Update

    Data Source Facebook Ad Insights

    The version of the Facebook API used above has been updated to v17.
    Please refer to the Meta for Developers documentation for the new version.

    2023-09-04

    ETL Configuration

    Data Source Microsoft Advertising

    Data Source Microsoft Advertising is newly added.
    Data can now be retrieved and transferred from Microsoft Advertising reports such as keywords and campaigns.
    See Data Source - Microsoft Advertising for details on the various input fields.

    Time Zone Setting

    Time zone can now be set.
    Any time zone can be applied to date and time specifications, such as the date and time displayed on the screen or the date and time of a scheduled execution of ETL Configuration.

    For more information, see About Time Zone Settings.

    2023-08-28

    Data Mart Configuration

    Data Mart Configuration in Google BigQuery Expands Choice of Partition Types

    In Data Mart Configuration in Google BigQuery, monthly and yearly options have been added to the partition types that can be specified in the query settings.
    When setting up partitions, there are four partition types to choose from:

    • Hourly
    • Daily
    • Monthly
    • Yearly

    2023-08-21

    ETL Configuration

    Add "card" to Data Source Twitter Ads to get data.

    Data Source Twitter Ads now includes card as a target for retrieval.
    This allows us to retrieve and transfer information such as the card's website URL.
    For more information, see Data Source - Twitter Ads.

    Enable/Disable header can be selected when output file format is CSV/TSV.

    In the Output Option of ETL Configuration STEP2, when a File/Storage Connector is the Data Destination, you can now select whether to enable or disable the CSV/TSV header for output.
    If you do not need a header line, select Disable.
    For more information, please refer to the section on output file format settings.

    Connection Configuration

    HTTP/HTTPS Connection Configuration, Authorization URL parameter can be added.

    In HTTP/HTTPS Connection Configuration, you can now add parameters to the authorization URL.
    Click on Configure Connection Configuration Details to see the parameters of the authorization URL.
    Some services require certain parameters to be passed when obtaining authorization codes as a condition for obtaining a token. Please use this item in such cases.
    For details, please refer to the HTTP/HTTPS Connection Configuration.

    2023-08-07

    Notice

    Change of login method

    On Monday, October 2, 2023, between 10:00 a.m. and 4:00 p.m., the method of logging into TROCCO will change.
    For more information, please refer to the Change of Login Method to TROCCO.

    ETL Configuration

    Add Data Destination HubSpot Engagement Object

    Engagement objects can now be selected in the object type in ETL Configuration STEP 1.
    This can be used as the Data Destination for interaction-related data.
    For more information, see Data Destination - HubSpot.

    Added Replace Empty Characters option to Data Destination Snowflake Configuration.

    In the Advanced Settings of STEP 1 of ETL Configuration, it is now possible to specify whether or not empty strings in the transferred data are to be replaced with NULL.
    Uncheck the box if you want empty strings in the data to be transferred to Snowflake as-is.

    UI・UX

    The Workflow definitions in which an item is embedded are now displayed on the various detail pages.

    Various detail pages, such as ETL Configuration and Data Mart Configuration, now display the Workflow definitions in which the item is embedded as a task.
    You can check the details on the various pages below.

    • ETL Configuration
    • Managed ETL Configuration
    • Data Mart Configuration
    • Workflow
    • dbt Job Setting

    Added links to various detail screens to list items in the Add Workflow Task modal

    Added links to the respective detail screens in the modal for adding various tasks on the flow edit screen.
    This allows the contents of tasks to be viewed immediately from the flow editing screen.

    • Managed ETL Configuration
    • Data Mart Configuration
    • Workflow
    • dbt Job Setting

    Improved searchability of the Data Mart Sync Job List screen

    The Data Mart Configuration list can now be filtered by definition name in the sidebar of the Data Mart Sync Job List screen.
    You can now easily view a list of any Data Mart Job by filtering by Data Mart Configuration name.

    API Update

    The Facebook API used for the following Connector has been updated to v16.

    • Data Source Facebook Ad Insights
    • Data Destination Facebook Offline Conversions
    • Data Source Facebook Lead Ads

    Please refer to the Meta for Developers documentation for the new version.

    2023-07-31

    Workflow

    Expanded the data warehouses supported by data check tasks within workflows

    Data checking tasks for the following data warehouses can now be executed in a workflow.

    • Snowflake
    • Amazon Redshift

    The results of the query against the data warehouse are checked against the error condition, and if the condition is met, the corresponding task is set to error.

    For more information, see Workflow Data Check.

    Custom Variable Loop Execution Edit Form with Snowflake Warehouse Suggestions

    In the input field for specifying a Snowflake warehouse as the target of Custom Variable loop execution, the warehouse associated with the Connection Configuration to be used is now displayed as a suggestion.

    ETL Configuration

    Data Source Adobe Marketo Engage added an item that allows Custom Variables to be embedded.

    Custom Variables can be embedded in the following input fields.

    • Filter type when custom object is selected as target
    • Workspace when folder is selected as target

    API Update

    Facebook API updated to v16

    The Facebook API used for the following Connector has been updated from v15 to v16.

    • Data Source Facebook Ad Creative
    • Data Destination Facebook Conversions API

    Please refer to the Meta for Developers documentation for the new version.

    UI・UX

    Added ETL Configuration details link when selecting ETL Configuration for workflow.

    A link to the ETL Configuration Details screen has been added to the Add TROCCO Transfer Job Task modal in the Workflow.
    This allows the user to move to the ETL Configuration details screen when selecting an ETL Configuration, and to make a selection after reviewing the contents.

    2023-07-24

    ETL Configuration

    OAuth 2.0 added to Data Source HTTP/HTTPS authorization method.

    OAuth 2.0 is added as an authorization method.
    This allows data to be retrieved from data sources that require OAuth authentication.
    For more information, see Data Source - HTTP/HTTPS.

    Added data that can be retrieved by Data Source Zendesk Support.

    Added ticket_comments to be retrieved by Data Source Zendesk Support.
    It is now possible to retrieve and forward comment data related to tickets, including the text of the comment and the ID of the sender of the comment.
    For more information, see Data Source - Zendesk Support.

    Data Source Adobe Marketo Engage added an item that allows Custom Variables to be embedded.

    Custom Variables can now be embedded in the following input fields.

    • Program ID filter when selecting a program member as target
    • List ID filter when selecting leads by static list as target

    XML (beta version) support for input file format

    "XML (beta version)" has been added as an input file format in the ETL Configuration of the following Connectors.

    • Data Source Google Drive
    • Data Source HTTP/HTTPS

    For more information, see About input file format settings.

    TROCCO API

    Added parameter to specify time zone in job execution request

    Added time_zone as a request parameter for ETL Job and Workflow Job execution.
    This allows you to specify the time zone for the date and time specified in context_time.
    For more information, see About the TROCCO API.
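
    As a rough sketch of how the parameter fits into a job execution request, the example below sends time_zone alongside context_time. The endpoint path, header, and payload layout are hypothetical placeholders; see the TROCCO API documentation for the actual request format.

        import requests

        response = requests.post(
            "https://trocco.io/api/jobs",  # hypothetical endpoint path
            headers={"Authorization": "Token YOUR_API_KEY"},  # hypothetical auth header
            json={
                "context_time": "2023-07-24 00:00:00",
                "time_zone": "Asia/Tokyo",  # time zone applied to context_time
            },
        )
        print(response.status_code)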

    2023-07-18

    API Update

    Data Source Criteo

    API version has been updated to v2023.04.
    Please refer to Version 2023.04 release notes for more information about the new version.

    ETL Configuration

    Expanded columns transferred from Data Source Facebook Ad Creative.

    Three columns have been added to the data transferred from Data Source Facebook Ad Creative.
    The following column names are displayed in the preview of ETL Configuration STEP2 and ETL Configuration Details.

    • ad_creative_object_story_spec_video_data_video_id
    • ad_creative_object_story_spec_child_attachments_image_hash
    • ad_creative_asset_feed_spec_image_hash

    Data Destination HubSpot custom object support

    Custom objects can now be selected for the object type in ETL Configuration STEP 1.
    This can be used to transfer data that cannot be categorized by HubSpot's standard objects.
    Custom objects are displayed as "xxxxxx (custom object)".
    image.png

    Connection Configuration

    Roles can now be specified in Snowflake Connection Configuration.

    Roles can now be specified in Snowflake Connection Configuration.
    Roles that grant the necessary permissions to access Snowflake from TROCCO can be tied to Connection Configuration.
    If not entered, Snowflake's default settings are used.

    Additional hosts selectable in Braze Connection Configuration

    rest.fra-02.braze.eu has been added to the host options in the Braze Connection Configuration.

    UI・UX

    The fixed header on various Data Mart screens now resizes in response to scrolling

    Scrolling down in the New, Detail, and Edit pages of the Data Mart will reduce the height of the fixed header.
    The display area for the contents of Data Mart Configuration has been widened, increasing the amount of information displayed on the screen.

    Before scrolling: image.png
    After scrolling: image.png

    2023-07-10

    ETL Configuration

    Data Source Yahoo! Search Ads now supports CampaignExportService.

    CampaignExportService has been added to the services for data acquisition that can be selected in ETL Configuration STEP1.
    See Data Source - Yahoo! Search Ads for the fields retrieved by the CampaignExportService.

    2023-07-03

    ETL Configuration

    Data Source TROCCO now supports Data Mart and Workflow Job execution history

    Data Mart and Workflow have been added to the list of data that can be transferred.
    The ETL Job transfers historical data about previously executed Data Mart Jobs and Workflow Jobs.
    See Data Source - TROCCO for details.

    Data Destination Amazon Redshift allows batch size to be specified.

    The batch size can now be specified in the advanced settings of ETL Configuration STEP 1.
    If an error due to insufficient memory occurs during transfer, the batch size can be adjusted to eliminate the error.
    For more information, see Data Destination - Amazon Redshift.

    Managed ETL

    Add PostgreSQL as Data Source

    PostgreSQL can now be selected as the Data Source for Managed ETL.
    You can import PostgreSQL tables in a batch and create the associated ETL Configuration in one place.
    See Managed ETL Configuration for various entry fields.

    UI・UX

    Expanded edit area on workflow edit screen

    The add task sidebar on the workflow editing screen can now be opened and closed.
    By closing the sidebar, a larger display area can be used for workflow editing.

    Sidebar (open): image.png
    Sidebar (closed): image.png

    Improved visibility of data lineage on data mart detail pages

    Data lineage information on the data mart detail page is now displayed in a hierarchical structure.
    Compared to before the change, where the hierarchical structure was not represented, it is now easier to see the relationship between data sets and tables.
    image.png

    2023-06-27

    ETL Configuration

    "XML (beta version)" added as an input file format

    XML files can now be selected as the input file format for ETL Configuration for some Data Source file and storage-based Connectors.
    For more information, see About input file format settings.

    Data Source Zendesk Support

    ticket_metrics has been added to the data acquisition targets.
    Various indicator data about the ticket, such as the date and time it was resolved and the time of the first reply, can now be retrieved and forwarded.
    For more information, see Data Source - Zendesk Support.

    Connection Configuration

    JDBC Driver Selection for MySQL

    The JDBC driver can now be selected in the MySQL Connection Configuration.
    For details, please refer to the Data Source - RDBMS Version Mapping Table.

    2023-06-19

    ETL Configuration

    Data Destination Snowflake

    UPSERT (MERGE) has been added as a transfer mode.
    If there is a record in the existing table based on the merge key, the record is updated; if there is no record, the record is appended.
    For more information, see Data Destination - Snowflake.
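
    The behavior corresponds roughly to a Snowflake MERGE statement such as the sketch below. The table names, columns, and merge key are hypothetical; TROCCO builds the actual statement from the ETL Configuration.

        # Rough illustration of UPSERT (MERGE) semantics. Names are hypothetical;
        # TROCCO generates the real statement from the ETL Configuration.
        merge_key = "id"
        target = "analytics.public.users"
        staging = "analytics.public.users_stage"

        merge_sql = f"""
        MERGE INTO {target} AS t
        USING {staging} AS s
          ON t.{merge_key} = s.{merge_key}
        WHEN MATCHED THEN
          UPDATE SET t.name = s.name, t.updated_at = s.updated_at   -- existing record: update
        WHEN NOT MATCHED THEN
          INSERT (id, name, updated_at) VALUES (s.id, s.name, s.updated_at)  -- new record: append
        """
        print(merge_sql)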

    Data Source TikTok Ads

    Support for loading of advertiser IDs has been added.
    By clicking "Load Advertiser ID", the advertiser ID associated with the previously selected Connection Configuration will be suggested.
    This allows you to create ETL Configurations without having to manually enter the advertiser ID.

    Connection Configuration

    Google Analytics 4

    Google account (OAuth) has been added as an authentication method.
    This allows Connection Configuration to be created without creating a JSON Key in the service account.

    UI・UX

    Custom Variable Definition Form

    The form has been substantially redesigned.

    2023-06-12

    ETL Configuration

    Data Source Google Analytics 4

    Data Source Google Analytics 4 is newly added.
    Please refer to Data Source - Google Analytics 4 for more information on the various input fields.

    2023-05-29

    ETL Configuration

    Data Source TROCCO supports ETL Job execution history transfer

    The history of ETL Job execution can now be transferred from Data Source TROCCO.
    Historical data of ETL Jobs executed up to one year in the past can be transferred.
    See Data Source - TROCCO for details.

    Data Destination Google Sheets now allows sorting of data.

    Data Order can now be set from the Advanced Settings in ETL Configuration STEP 1.
    Records can be sorted by sort key name and sort order.
    See Data Destination - Google Sheets for more information.

    Data Source Box can now select the decompression format for data transfers.

    The decompression format can now be selected in ETL Configuration STEP1.
    If the data to be transferred from Box is compressed, it will be decompressed and transferred in the selected format.
    The decompression format can be selected from the following four options (a rough standard-library illustration follows the list).

    • bzip2
    • gzip
    • tar.gz
    • zip
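
    For reference, the four formats above map roughly onto Python's standard library as in the sketch below; the file name is hypothetical and this is a simplification rather than a description of the actual transfer behavior.

        import bz2
        import gzip
        import tarfile
        import zipfile
        from pathlib import Path

        def decompress(path: str) -> None:
            """Rough mapping of the four supported formats to Python's standard library."""
            p = Path(path)
            if p.suffixes[-2:] == [".tar", ".gz"]:
                with tarfile.open(p, "r:gz") as tar:      # tar.gz: archive of files
                    tar.extractall(p.parent)
            elif p.suffix == ".zip":
                with zipfile.ZipFile(p) as zf:            # zip: archive of files
                    zf.extractall(p.parent)
            elif p.suffix == ".gz":
                p.with_suffix("").write_bytes(gzip.decompress(p.read_bytes()))  # gzip: single file
            elif p.suffix == ".bz2":
                p.with_suffix("").write_bytes(bz2.decompress(p.read_bytes()))   # bzip2: single file
            else:
                raise ValueError(f"Unsupported compression format: {path}")

        # Hypothetical usage:
        # decompress("export.tar.gz")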

    Custom Variable embedding support

    ETL Configuration and Data Mart have been expanded to include more input fields in which Custom Variables can be embedded.
    Custom Variables, which can be dynamically populated at runtime, are now available in more Connector services.

    ETL Configuration

    • Data Source Google BigQuery
      • Temporary Table Creation Destination Dataset
    • Data Destination Amazon S3
      • Bucket
    • Data Destination Google Cloud Storage
      • Bucket
    • Data Destination Google BigQuery
      • Dataset
    • Data Destination FTP/FTPS
      • Path prefix
    • Data Destination PostgreSQL
      • Database, Schema, Table
    • Data Destination SFTP
      • Path prefix

    datamart

    • Snowflake
      • Warehouse, Output destination database, Output destination schema

    2023-05-22

    ETL Configuration

    Expanded Decompression Format Options for Data Source Google Cloud Storage

    "bzip2" and "gzip" have been added to the decompression format options in ETL Configuration STEP 1.
    When transferring compressed data from Google Cloud Storage, you can choose from the following four decompression formats

    • bzip2
    • gzip
    • tar.gz
    • zip

    Data Destination FTP/FTPS now has selectable transfer mode.

    The "Transfer Mode" can now be selected in ETL Configuration STEP 1.
    You can choose from the following two options

    Parallel Transfer
    Applies parallel processing to perform the transfer.
    Compared to "Output File Count Suppression Transfer," the transfer time is reduced.
    Due to parallel processing, a file retrieved from the Data Source may be split into multiple files and sent to the Data Destination.

    Output File Count Suppression Transfer
    Performs the transfer without applying parallel processing.
    Unlike "Parallel Transfer," files retrieved from the Data Source are sent to the Data Destination without being split.

    Custom Variable embedding support

    ETL Configuration and Data Mart have been expanded to include more input fields in which Custom Variables can be embedded.
    Custom Variables, which can be dynamically populated at runtime, are now available in more Connector services.

    ETL Configuration

    • Data Source Amazon Redshift
      • Database, Schema
    • Data Source Amazon S3
      • Bucket
    • Data Source Azure Blob Storage
      • Path prefix
    • Data Source Box
      • Folder ID
    • Data Source Google Sheets
      • URL of the spreadsheet
    • Data Source Google BigQuery
      • Temporary Table Creation Destination Dataset
    • Data Source Google Cloud Storage
      • Bucket
    • Data Source Google Drive
      • Folder ID
    • Data Source PostgreSQL
      • Database, Schema
    • Data Source Snowflake
      • Warehouse, Database, Schema
    • Data Destination PostgreSQL
      • Database, Schema, Table
    • Data Destination Google BigQuery
      • Dataset
    • Data Destination Google Drive
      • Folder ID

    datamart

    • Google BigQuery
      • Destination data set.

    2023-05-15

    ETL Configuration

    Added "Upper Case Conversion" and "Upper Snake Case Conversion" as batch conversion formats for column names

    In ETL Configuration STEP 2 Column Setting, "Upper Case Conversion" and "Upper Snake Case Conversion" have been added to the conversion format for Batch Column Name Conversion.
    image.png
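
    As a rough illustration of what the two new conversions do to column names (the example names below are hypothetical):

        import re

        def to_upper_case(name: str) -> str:
            # "Upper Case Conversion": upper-cases the column name as-is.
            return name.upper()

        def to_upper_snake_case(name: str) -> str:
            # "Upper Snake Case Conversion" (illustrative interpretation):
            # insert underscores at word boundaries, then upper-case.
            s = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name)  # camelCase -> camel_Case
            s = re.sub(r"[\s\-]+", "_", s)                    # spaces/hyphens -> _
            return s.upper()

        # Hypothetical column names before and after conversion.
        for col in ["orderId", "created at", "unit-price"]:
            print(col, "->", to_upper_case(col), "/", to_upper_snake_case(col))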

    Custom Variable Support for Data Destination Google Ads Conversions

    Custom Variables are now available for the following items in the ETL Configuration of Data Destination Google Ads Conversions.

    • Customer ID
    • Conversion Action ID

    UI・UX

    The fixed header in the ETL Configuration details screen now resizes according to scrolling

    Scrolling down in the ETL Configuration details screen will reduce the height of the fixed header.
    The display area for the contents of ETL Configuration has been widened, increasing the amount of information displayed on the screen.

    Before scrolling: image.png
    After scrolling: image.png

    Data Catalog

    Full text display of table logical names in ER diagrams

    Previously, full-text display on hover was available only for long table names; it is now also available for long logical names.

    2023-05-08

    Custom Variable support for DWH services

    Custom Variable Loop Execution in Workflow allows for flexible loop processing when Custom Variables are set in the DWH service.
    The items that now support Custom Variable input are as follows:

    ETL Configuration

    • Data Destination Snowflake
      • warehouse
      • database
      • schema
      • table
    • Data Destination Redshift
      • database
      • schema
      • table
      • Amazon S3 bucket
      • Amazon S3 key prefix
    • Data Source BigQuery
      • Data export destination Google Cloud Storage URI

    datamart

    • Redshift
      • database
      • schema
      • table

    ETL Configuration

    "Page Size" can be specified in Data Source Zendesk Support

    The "Page Size" setting in Data Source Zendesk Support now specifies the number of items retrieved in a single request.

    Improved error handling in Data Destination Salesforce

    When a record fails to send, Data Destination Salesforce can now set the transfer status to Error.

    dbt linkage

    "Target" and "Location" can now be specified in dbt Job Settings

    "Target" and "Location" can now be specified in dbt Job Settings.
    The "Location" field appears only when the dbt Job Setting uses a dbt Git repository with BigQuery selected as the adapter.

    2023-04-24

    UI・UX

    Improved usability of sidebar

    The hierarchical structure and order of items in the sidebar displayed on the left side of the screen has been changed.
    Functions with multiple pages can now be collapsed and expanded item by item, and the item for the currently displayed page is shown expanded.

    release-notes-2024-08-29-6-58-73

    2023-04-17

    ETL Configuration

    Data Destination Yahoo! JAPAN Ads Display Ads Conversion Measurement API (Beta Feature)

    Data Destination Yahoo! JAPAN Ads Display Ads Conversion Measurement API (Beta Feature) has been newly added.
    For details on the various input fields and column mappings, please refer to Data Destination - Yahoo! JAPAN Ads Display Ads Conversion Measurement API.

    Data Destination Snowflake

    In ETL Configuration STEP1, the selectable values for the following resources can now be loaded based on the specified Connection Configuration.

    • Warehouse Name
    • Database
    • Schema

    UI・UX

    Expanded workflow task information

    Workflow tasks now display the creator. Also, a message is now displayed if you do not have permission to view the file.

    2023-04-10

    dbt linkage

    Official Release

    The dbt integration feature, which was offered as an optional feature in the beta version, is now available as an official version.
    This allows you to use this function without having to contact a representative.
    For more information, see About dbt Linkage.

    Location settings for BigQuery datasets to connect to

    On the Create/Edit dbt Job Setting screen, you can now enter a location when you select a dbt Git repository with BigQuery selected as the adapter.
    A destination dataset is created at the specified location.

    ETL Configuration

    Data Destination Twitter Ads Web Conversions

    Data Destination Twitter Ads Web Conversions has been newly added.
    See Data Destination - Twitter Ads Web Conversions for more information on the various input fields and column mappings.

    Data Source Snowflake

    In ETL Configuration STEP1, the selectable values for the following resources can now be loaded based on the specified Connection Configuration.

    • Warehouse Name
    • Database
    • Schema

    UI・UX

    Improved visibility of workflow loop execution forms

    The layout of the workflow loop execution form has been revised to facilitate deletion and other operations.

    2023-04-03

    ETL Configuration

    Data Destination LINE Conversion API

    A new Data Destination LINE Conversion API has been added.
    For more information on the various input fields and column mappings, please refer to Data Destination - LINE Conversion API.

    Workflow

    Custom Variable loop execution supports Snowflake queries

    Custom Variable loop execution now supports Snowflake queries.
    Custom Variable expansion values in a loop run can be set based on the results of a Snowflake query.

    API Update

    Data Source Shopify

    2023-03-27

    UI・UX

    Expanded dashboard on TROCCO home screen

    • Workflow definitions are now displayed on the dashboard of TROCCO's home screen.
      image.png

    2023-03-20

    ETL Configuration

    Data Source Google Search Console

    • The data to be acquired can now be filtered by dimension filters.
      • Clicking on Set Details in ETL Configuration STEP 1 will bring up the Dimension Filters.
      • Currently, only PAGE items are supported.
        image.png

    Data Catalog

    Logic for obtaining metadata when using team functions

    • Metadata about the data source (e.g., metadata about MySQL) is obtained by connecting to the data source using Connection Configuration on TROCCO, which is available to Data Catalog managers.
    • When used in conjunction with the Team function, only Connection Configurations for which the Data Catalog administrator has the Operator role or higher privileges will be used.

    For more information on this matter, please refer to the specifications for obtaining catalog data and metadata.

    UI・UX

    Visualization of Connector usage

    • Connector usage is now displayed on the TROCCO home screen.
      image.png

    Improved visibility of Connector list

    • Each Connector is now more clearly displayed on the Service Selection screen when creating ETL Configuration and on the list of supported services on the Home screen.
      image.png

    API Update

    Data Destination Facebook Conversions API

    • The Facebook API used for the above Connector has been updated from v14 to v15.
    • Please refer to the Meta for Developers documentation for the new version.

    2023-03-06

    Notice

    The name of each Connector of TROCCO has been changed as follows.

    Before change → After change
    Aurora MySQL → Amazon Aurora MySQL
    Cisco AMP → Cisco Secure Endpoint
    CloudWatch Logs → Amazon CloudWatch Logs
    DynamoDB → Amazon DynamoDB
    GitHub(GraphQL) → GitHub GraphQL API
    Marketo → Adobe Marketo Engage
    Pardot → Salesforce Marketing Cloud Account Engagement
    SQL Server → Microsoft SQL Server
    Tableau CRM → Tableau CRM Analytics

    2023-02-27

    ETL Configuration

    Data Destination Google Analytics 4 Measurement Protocol

    Data Destination Google Analytics 4 Measurement Protocol is newly added.
    For more information on the various input fields and column mappings, see Data Destination - Google Analytics 4 Measurement Protocol.

    2023-02-20

    UI・UX

    Tutorial dialog

    • A tutorial dialog is now available for first-time TROCCO users.
      • By following the three steps in the dialog, you will understand the basic functions of TROCCO.
      • This dialog will be hidden after the three steps of "Creating Connection Configuration," "Creating ETL Configuration," and "Executing ETL Job" are completed.

    image

    Markdown notation support for memos in each setting

    • Markdown notation is now supported for the memo fields in the following settings.
      • ETL Configuration
      • Data Mart Configuration
      • Workflow

    ETL Configuration

    Microsoft Excel file

    • Microsoft Excel files can now be transferred faster.
      • In the File/Storage Connector, when a Microsoft Excel file is selected as the input file format, you can now choose how to retrieve the values.
      • By selecting the use of cache as the method for retrieving values, files are transferred faster than before.
    Cache usage

    If you select "Use Cache" in the column configuration, ETL Configuration will use the values at the time the Microsoft Excel Files are saved locally for the transfer.
    Therefore, functions whose results change with each calculation, such as date/time functions (e.g., the TODAY function) and random number generation functions (e.g., the RAND function), are not recalculated when the transfer is executed.

    Select Recalculate on Transfer if the formulas in the cells need to be recalculated when the transfer is executed.
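
    The distinction is roughly the same as reading an Excel file with or without cached values in a library such as openpyxl; the sketch below assumes a local file named sample.xlsx exists and is purely illustrative.

        from openpyxl import load_workbook

        # "Use Cache": read the values that were stored when the file was last
        # saved in Excel. Volatile functions such as TODAY() or RAND() are NOT
        # re-evaluated, so the stored value may be stale.
        cached = load_workbook("sample.xlsx", data_only=True)
        print(cached.active["A1"].value)    # e.g. the date that was current at save time

        # Without data_only, the formula text itself is returned; recalculating it
        # (the "Recalculate on Transfer" option) is what refreshes TODAY(), RAND(), etc.
        formulas = load_workbook("sample.xlsx", data_only=False)
        print(formulas.active["A1"].value)  # e.g. "=TODAY()"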

    Data Catalog

    Metadata Import

    • For CSV files used for metadata import, template files can now be downloaded from the screen.
      • You can download a CSV file with pre-loaded header rows for your Data Catalog.
      • See Metadata Import for more information.

    Workflow

    Flow screen

    Multiple tasks can now be selected and moved together.

    Error indication during loop execution in query results

    An error message is now displayed when a query written in a Google BigQuery query result loop or an Amazon Redshift query result loop fails to execute.

    Managed ETL

    Label Setting

    • Labels can now be added and removed in batches, even after a Managed ETL Configuration has been created.
      • Batch addition and removal can be performed from both the details and edit screens of the Managed ETL Configuration.

    Connection Configuration

    Google BigQuery

    When a Google account (OAuth) is selected as the authentication method, the list of selectable projects can now be loaded for the project ID field.

    API Update

    Data Source Google Ads

    The version of Google Ads API has been updated from v11 to v12.
    Please refer to the Google Ads API documentation for information on the new version.

    2023-02-13

    Managed ETL

    Data Source Salesforce added

    • Salesforce has been added as a Data Source.
      • ETL Configurations can be created at once for all objects accessible through the Connection Configuration.

    2023-02-06

    Notice

    Data Source LINE Ads

    End of providing connectors using scraping

    As of 00:00, 02/01/2023, Data Source Using Scraping - LINE Ads (Discontinued) is no longer offered.
    Thereafter, executing an ETL Job using the Data Source - LINE Ads (to be discontinued) Job Setting will result in an error.

    If you wish to create a new ETL Configuration in the future, please use the LINE Ads API-based ETL Configuration - Data Source - LINE Ads.

    Workflow

    Google BigQuery data check

    • Job IDs are now displayed in the execution log.
      image

    2023-01-23

    Data Catalog

    • The Data Catalog feature of the Snowflake version is now available.

    Workflow

    • Workflow definitions can now be duplicated.
      • You can duplicate a Workflow definition from the hamburger menu in the upper right corner of the Workflow definition details screen.
        image.png

    API Update

    Data Destination Facebook Offline Conversions

    • The Facebook API used for the above Data Destination has been updated from v14 to v15.
    • Please refer to the Meta for Developers documentation for the new version.

    2023-01-16

    ETL Configuration

    Data Source Google Ads

    • The following resource types (report types) have been added
      • ad_group_asset
      • customer_asset

    dbt linkage

    • The supported version has been updated from v1.2 to v1.3.

    Data Catalog

    • A link to the column reference list has been added to the table information screen.
      • Clicking on the link will display a list of Column Setting references defined in the table in question.
      • For more information on column references, see Column References.

    2023-01-10

    API Update

    Data Source Facebook Ad Insights

    • The Facebook API used for the above Data Source has been updated from v14 to v15.
    • Please refer to the Meta for Developers documentation for the new version.

    Connection Configuration

    PostgreSQL

    • You can now select the version of the JDBC driver that connects to the PostgreSQL server.
      • Please select the driver version according to the version of PostgreSQL you are using.
        • PostgreSQL 8.2 or higher: JDBC Driver 42.5.1
        • Earlier than PostgreSQL 8.2: JDBC Driver 9.4.1205 JDBC41

    2022-12-26

    API Update

    Data Source Facebook Lead Ads and Data Source Facebook Ad Creative

    • The Facebook API used for the above Data Source has been updated from v14 to v15.
    • Please refer to the Meta for Developers documentation for the new version.

    Workflow

    • The execution log for each task can now be easily displayed on the Workflow Job detail screen.
      • By clicking on a task on the flow, the execution log of the task is displayed.

    2022-12-19

    Notice

    Data Source LINE Ads

    Data extraction using LINE Ads API

    LINE Corporation has recently released an API-based method for extracting LINE Ads data.
    Accordingly, TROCCO now provides a separate new connector that uses the LINE Ads API.

    For more information, see Data Source - LINE Ads.

    Other UI/UX

    Notification settings

    • Notifications regarding the following jobs are now sent even if the job was skipped.
      • ETL Job
      • Data Mart Job
      • Workflow Job

    Managed ETL

    • It is now easy to see which workflows are using Managed ETL Configuration.
      • On the Managed ETL Configuration details page, under Workflows that use this configuration, a list of workflows that incorporate the relevant Managed ETL Configuration will be displayed.

    2022-12-12

    Connection Configuration

    Amazon S3

    • Connections to Amazon S3 using authentication by IAM roles are now supported.
      • You can connect to Amazon S3 without having to place AWS access keys and AWS secret keys on TROCCO.
      • In Amazon S3 Connection Configuration, you can select the IAM role as the AWS authentication method.

    ETL Configuration

    Data Source AppsFlyer

    • Data Source AppsFlyer is newly added.
      • Please refer to Data Source - AppsFlyer for details on the report types, fields, and various restrictions that can be obtained.

    Data Destination HubSpot

    • Support for renewal of contact subscriptions.
      • Subscription can be selected as the object type.
      • See Renewing Subscriptions for more information, including the schema for transfer data.

    Other UI/UX

    Team Functions

    • Support for displaying "dbt Job Settings" and "dbt Repository" on the Resource Group detail screen has been added.

    2022-12-05

    ETL Configuration

    Data Destination Box

    • A new Data Destination Box has been added.
      • You can transfer files to Box by specifying the path of the folder.

    Workflow

    • Child workflows embedded in a workflow can now be looped.
      • Loop execution can be set up by adding a workflow as a task on the flow edit screen and then editing the task.

    Other UI/UX

    ETL Configuration List

    • In the ETL Configuration list, you can now delete a Schedule from the selected ETL Configuration in a batch.

    Data Catalog

    • The width of the query editor in the Data Catalog can now be changed.
      • You can change the width of the query editor to any size you like by dragging its border and moving it left or right.

    2022-11-28

    Connection Configuration

    MongoDB

    • For a MongoDB replica set, you can now select the node (member) you want to read.
      • In the "Read Settings" section, you can select from the following five items.
        • primary
        • primaryPreferred
        • secondary
        • secondaryPreferred
        • nearest
      • Please refer to the help documentation for more information on the read settings (a brief driver-level sketch follows).
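
    Outside TROCCO, these options correspond to MongoDB's standard read preference modes; a small driver-level sketch with a hypothetical replica-set connection string:

        from pymongo import MongoClient

        # Hypothetical replica-set connection string. "readPreference" accepts the
        # same five modes listed above (primary, primaryPreferred, secondary,
        # secondaryPreferred, nearest).
        client = MongoClient(
            "mongodb://host1:27017,host2:27017/?replicaSet=rs0"
            "&readPreference=secondaryPreferred"
        )
        print(client.read_preference)   # confirms the mode the driver will use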

    ETL Configuration

    Data Source TikTok Ads

    • The dimensions that can be acquired have been expanded. The dimensions supported in this update are as follows.
      • Basic data metrics
        • reach
        • frequency
        • result
      • Video play metrics
        • video_watched_2s
        • video_watched_6s
        • average_video_play
      • Page Event Metrics
        • web_event_add_to_cart
        • on_web_order
        • initiate_checkout
        • add_billing
        • page_event_search

    2022-11-21

    API Update

    Twitter Ads

    • API version has been updated from v11 to v12.
    • For more information about the new version, please refer to the Twitter Developers documentation.

    Facebook Offline Conversions

    • The Facebook API used for Data Destination Facebook Offline Conversions has been updated from v14 to v15.
    • Please refer to the Meta for Developers documentation for the new version.

    ETL Configuration

    Data Destination HubSpot

    • Data Destination has been expanded to include more object types. The object types that have been added are as follows
      • company
      • deal
      • product
      • ticket
      • line_item
      • quote
    • Append (INSERT) has been added as a transfer mode.
    • For UPSERT in transfer mode, the UPSERT key can now be specified.

    Data Destination and Data Source Google BigQuery

    • Under "Dataset Locations," the following three new locations can now be selected
      • europe-west8 (Milan)
      • europe-west9 (Paris)
      • europe-southwest1 (Madrid)

    Data Destination Snowflake

    • Columns of type json are now imported as VARIANT by default.
      • Until now, columns of type json have been imported as VARCHAR.
      • If you want to import json type columns as VARCHAR type as before, please make the following settings.
        • In STEP 2, select VARCHAR as the Data Type in "Column Setting" under Output Option.

    Other UI/UX

    ETL Configuration List

    • Schedules can now be added to selected ETL Configurations in the list of ETL Configurations in a batch.

    Data Catalog

    • The width of the sidebar in the Data Catalog can now be changed.
      • You can change the width of the sidebar to any width you like by dragging the sidebar border and moving it left or right.

    Team Functions

    • In the Connection Configuration list on the Resource Group details screen, the service name of Connection Configuration is now displayed in the form of a link to each Connection Configuration.

    2022-11-14

    datamart

    • Data Mart - Snowflake now supports output to databases with hyphens in their names.
      • When the data transfer mode is selected as the query execution mode, a database with a hyphen in its name can now be specified as the output database.

    2022-11-07

    API Update

    Yahoo! Ads

    API version has been updated from v8 to v9.
    For more information about the new version, please refer to the API Reference.

    ETL Configuration

    Data Source Google Drive

    • Files located on a shared drive can also be transferred.

    Data Destination Google Drive

    • You can now also transfer to a folder on a shared drive.

    dbt linkage

    • dbt run-operation has been added as a command to run Job Setting.
      • Macros can be called by entering a macro name.

    2022-10-24

    workflow

    • Loop execution of jobs on a workflow can now be based on Amazon Redshift query results.
      • Custom Variable expansion values in loop runs can be set based on Amazon Redshift query results.
      • By storing the values you want to expand in Amazon Redshift tables, you can define a Workflow in which the expansion values change with each execution, as sketched below.
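
    Conceptually (and ignoring TROCCO internals), the loop can be pictured as follows: each row returned by the configured query becomes one iteration, and the column values are substituted into the custom variables of the looped job. The query results and variable name here are hypothetical.

        # Conceptual sketch of custom-variable loop execution (not TROCCO internals).
        query_results = [("tokyo",), ("osaka",), ("nagoya",)]   # hypothetical Redshift rows

        for (region,) in query_results:
            custom_variables = {"$region$": region}
            # One loop iteration per row: the looped job runs with these values expanded.
            print(f"run looped job with custom variables {custom_variables}")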

    Data Catalog

    • Basic metadata values can now be imported using CSV files.
      • You can import metadata from "Metadata Import" in the Data Catalog Setting.
      • See Metadata Import for more information.

    2022-10-17

    Connection Configuration

    • Connections to Oracle Autonomous Database are now supported.
      • Wallet files can now be uploaded when " Use tnsnames.ora file" is selected under "Connection Method" in the Oracle Database Connection Configuration.
      • Uploading the wallet file will enable the connection to the Oracle Autonomous Database.

    Data Catalog

    • For summary statistics, minimum and maximum values for date and time types are now displayed.
      • Summary statistics are available in column information and previews.

    2022-10-11

    ETL Configuration

    • Data Destination BigQuery's STEP2 and Output Option now supports splitting tables by finer time units.
      • Prior to the change, only date-based division was supported.
      • With this change, you can now choose from the following four units of table division.
        • Every 1 hour
        • Every 1 day
        • Every 1 month
        • Every 1 year

    2022-10-03

    Connection Configuration

    • PostgreSQL Connection Configuration now supports connections via AWS Systems Manager Session Manager.

    ETL Configuration

    • The File/Storage Connector now supports Microsoft Excel files (xlsx and xls) as input file formats.
      The supported Connectors are as follows
      • Data Source - Amazon S3
      • Data Source - Azure Blob Storage
      • Data Source - Box
      • Data Source - FTP/FTPS
      • Data Source - Google Cloud Storage
      • Data Source - Google Drive
      • Data Source - HTTP(S)
      • Data Source - SFTP
      • Data Source - Local Files

    workflow

    • For automatic retries in the event of a workflow failure, the interval before the next retry can now be specified.
      • When the number of retries is set to 1 or more on the workflow settings screen, an interval can be specified.

    Data Catalog

    • For columns generated using TROCCO's Data Mart function, metadata about the data source of the source column can now be inherited automatically.
      However, the following conditions must be met for the metadata to be inherited automatically.
      • No processing is applied to the column values on the Data Mart Configuration.
      • Data Transfer Mode is used as the query execution mode for Data Mart Configuration.

    2022-09-20

    API Update

    Google Ads

    Managed ETL Configuration

    • The edit screen for Managed ETL Configuration has been expanded.
      • Settings related to Data Source and Data Destination of registered ETL Configurations and common settings can now be changed in a batch.

    Other UI/UX

    ETL Configuration List

    • ETL Configuration can now be narrowed down by the presence or absence of Notification Settings.
    • ETL Configuration can now be narrowed down by whether or not Schedule is available.
    • Labels attached to selected ETL Configurations can now be removed in a batch.

    2022-09-12

    API Update

    Yahoo! Ads

    • API version has been updated to v8.
      • Due to an API update, the old indicator has been discontinued.
      • From now on, if a column containing "(old)" is specified in the column name, the new column will be automatically obtained.
    • For more information on the new version, please refer to the Yahoo! Ads API v8 System Release Completion Announcement.

    Criteo

    Data Catalog

    • A button to open the query editor has been added to the table information tab.
      • Click the button to launch the query editor.
      • The query is displayed with the table opened in the Table Info tab specified in the FROM clause.
    • You can now re-authenticate your Google account in Data Catalog Setting.

    2022-09-05

    ETL Configuration

    • TROCCO has been added as a Data Source.
      • Data related to "User-Defined Metadata" and "Column Setting Reference List" held in TROCCO's Data Catalog can be transferred.
    • Data Destination Marketo can now specify a static list ID.

    Managed ETL Configuration

    • Slack notifications for created/dropped tables now include the name of the detected table.
      • Previously, the number of tables added and deleted was displayed.
      • From now on, in addition to the number of tables added and deleted, the name of the table will also be displayed.

    dbt linkage

    • Custom Variables are now available in the option values of the Run command in Job Setting.

    Data Catalog

    • User-defined metadata values can now be edited in the Markdown editor.
      • If you want to edit field values using a Markdown editor, specify "Text(Markdown)" in the "Template for user-defined metadata" data type.
    • Logical names are now displayed directly below column names in the preview under "Table Information".

    Other UI/UX

    • You can now check the authentication method for each user on the Account/User screen.
      • You can check if 2-step verification and SAML verification are enabled respectively.

    2022-08-29

    Notice

    • HubSpot API keys will no longer be available as an authentication method to access the HubSpot API after November 30, 2022.
      • Please use OAuth 2.0 as your authentication method in the future.
      • For more information, please refer to the official documentation.

    ETL Configuration

    • Custom Variables can now be used as Filter Settings in ETL Configuration STEP2 "Data Preview and Advanced Settings".
    • Custom Variables can now be used as HTTP header key/value in Data Source HTTP(S).

    Data Catalog

    • The Markdown editor is now available for editing the basic metadata item "Description" field.
    • Query Editor can now download query execution results in CSV format.
      • The maximum amount of data that can be downloaded is 10 MB, and lines that exceed the size limit will be truncated.

    Managed ETL Configuration

    More details on Managed ETL Configuration will be presented in the August 2022 Release Notes.

    Other UI/UX

    • In the ETL Configuration list, "Author" has been added to the filter field.
      • The list can be narrowed down to ETL Configurations created by the selected user.
    • In the ETL Configuration list, the filter item "Name (partial match)" now supports AND search.
      • AND searches can be performed by entering entries separated by spaces.
    • Labels can now be added to selected ETL Configurations in the list of ETL Configurations in a batch.

    2022-08-22

    dbt linkage

    • Snowflake and Redshift have been added to the dbt Git repository adapters.

    Data Catalog

    • TROCCO basic metadata settings have been added.
      • You can now choose whether the basic metadata items "logical name" and "description" defined on the TROCCO side are displayed in the column list.

    2022-08-08

    ETL Configuration

    • Data Source Marketo target can now select a folder.
    • HubSpot has been added as a Data Destination.
      • Contact object is supported.
    • When line_item is selected as the report type in Data Source Twitter Ads, campaign_id is now included in the transfer content.

    dbt linkage

    • dbt integration functionality has been released.
      • We provide a runtime environment for the OSS version of dbt and GitHub integration.
      • The ETL to data modeling flow can be defined on the TROCCO Workflow function.
      • Please refer to the press release for details, including future release plans.

    2022-08-01

    ETL Configuration

    • Data Source Marketo targets can now select program members.
    • Data Destination BigQuery's STEP2, Output Option, now allows the selection of NUMERIC type as the Data Type for Column Settings.

    2022-07-25

    workflow

    • Automatic retry is now available when a workflow fails.
      • The number of times an automatic retry is performed can be specified on the workflow setup screen.

    2022-07-19

    ETL Configuration

    • Google Drive has been added as a Data Destination.
    • Updated the regions that can be specified when BigQuery is selected as the Data Source or Data Destination.

    datamart

    • When BigQuery is selected in Data Mart Configuration, the regions that can be specified in Data Processing Location have been updated.

    Data Catalog

    • TROCCO original metadata can now be set.
      • In addition to user-defined metadata, TROCCO now provides fields for default metadata set on the TROCCO side.

    Other UI/UX

    • All screens for ETL Configuration, Data Mart Configuration, Workflow Configuration, and various account settings are now available in English.

    2022-07-11

    ETL Configuration

    • Data Source Shopify can now specify a transaction object as the target.
      • Data stored in a transaction object can now be retrieved.
    • In Data Source Marketo, when a lead by static list is specified as the target, the list ID to be retrieved can now be specified.

    workflow

    • If a nested workflow fails midway, re-running it now resumes from the position where the child workflow stopped.

    Data Catalog

    • Templates for User-Defined Metadata now allow multiple fields to be selected for display in the Column Setting list.
      • Data Catalog column information will be displayed in the same order as set in the Edit Template screen for User-Defined Metadata.
    • Primary table metadata can now be inherited for BigQuery tables created in TROCCO's data mart.
      • Column-based lineage can now be displayed between tables with inherited metadata.

    2022-07-04

    ETL Configuration

    • When using CDC at the Data Source, up to 5 automatic retries are now performed.
      • If an error occurs during the binlog transfer phase, it will automatically retry up to 5 times.

    Data Catalog

    • Suggestions are now displayed in the ON clause in the query editor.
      • Suggested conditions are displayed based on the dependencies registered in the Data Catalog.

    2022-06-27

    workflow

    • TROCCO batch registration has been added as a workflow task.
      • The ETL Configurations associated with a batch registration can be executed in a batch.

    2022-06-20

    ETL Configuration

    • Data Source Shopify can now specify Metafield as the target.
      • Data stored in Metafield resources can now be retrieved.

    Data Catalog

    • Tab information is now maintained in the URL on the Data Catalog screen.
      • When you move between tabs and then return with the browser's back button, you are now taken back to the state before the transition.

    2022-06-06

    ETL Configuration

    • In Data Source Google Ad Manager, it is now possible to specify whether to display the topmost ad unit or all units.
      • It is now possible to obtain deep hierarchical reports by child and grandchild ads.

    datamart

    • A free description mode has been added for Redshift in Data Mart Configuration.
      • You can issue any query, including DDL statements, to the Redshift DWH.

    2022-05-30

    datamart

    • A free description mode has been added for Snowflake in Data Mart Configuration.
      • You can issue any query, including DDL statements, to Snowflake's DWH.

    2022-05-23

    ETL Configuration

    • Added extended conversions to Data Destination Google Ads Conversions.
      • Until now, TROCCO only supported offline conversions as Data Destination Google Ads Conversions, but now it also supports extended conversions.
      • Accordingly, "Data Destination Google Ads Offline Conversions" has been renamed "Data Destination Google Ads Conversions".
    • Data Source Local Files can now specify the character encoding in STEP 1.
    • The Report Template feature is now available in Data Source Facebook Ad Insights.
      • In STEP 1 of the Edit ETL Configuration screen of Data Source Facebook Ad Insights, you can now get the main fields for each report type in the template.

    datamart

    • Data Mart Configuration in BigQuery can now specify data processing locations when using Free Description mode.
      • Previously, the query was only for the US region, but now you can specify the region of the target dataset.

    Other UI/UX

    • Up to 50 definitions can now be displayed on the Data Mart Configuration List and Workflow Definition List screens.

    2022-05-16

    ETL Configuration

    • Data Source Shopify now supports custom apps.

      • Custom apps can now be selected for Connection Configuration in Shopify as well.
    • Zoho CRM has been added as a Data Destination.

    • The ETL Configuration List screen can now display up to 50 ETL Configurations.

    Data Catalog

    • The amount of data scanned by BigQuery can now be displayed in the Query Editor.
      • In addition, issuing an invalid query now returns the location of the error.

    2022-05-02

    ETL Configuration

    • Data Source Amazon S3 can now select the tar.gz format in the decompression settings.
      • When extracting in tar.gz format, the relative path after decompression can now be specified using regular expressions (see the sketch below).
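
    As a small illustration of such a relative-path regular expression (the file names and pattern below are hypothetical):

        import re

        # Hypothetical example: after extracting export.tar.gz, transfer only the
        # CSV files under the data/ directory by matching their relative paths.
        pattern = re.compile(r"^data/.*\.csv$")
        relative_paths = ["data/2022-05-01.csv", "data/2022-05-02.csv", "README.txt"]
        print([p for p in relative_paths if pattern.match(p)])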

    Data Catalog

    • When updating user-defined metadata, indexes are now updated at the same time.
      • User-defined metadata now hits the search as soon as it is updated.

    2022-04-25

    workflow

    • Separate icons for each workflow task type.
      • It is now easier to intuitively understand what the tasks on the flow represent.

    Data Catalog

    • In Query Editor, the project selected at the last startup can now be used as is.
      • It is no longer necessary to select the project that is primarily used each time the query editor is started.
    • Table and column dependencies can now be displayed and deleted.
    • PostgreSQL has been added as a metadata collection target.

    2022-04-18

    ETL Configuration

    • Rtoaster insight+ with Google Account integration has been added as a Data Source.
      • Rtoaster insight+ with Google Account integration can now be specified for data transfer.
      • If you wish to use Rtoaster insight+ with Google Account integration as a Data Source, you must add your Google account to Rtoaster insight+.
        For information on how to add it to Rtoaster insight+, please see this help page.

    Data Catalog

    • Registered user-defined metadata can now be searched from the search bar at the top of the screen.
      • If user-defined metadata contains fields of type string, you can search for values within those fields.
        (e.g., if you had stored the value "sales" in a table's "logical name" metadata, typing "sales" in the search bar will now show that table in the search results).
    • When searching, you can now move to the "Search Results" screen to check detailed information and filter and sort.

    2022-04-11

    ETL Configuration

    • ZohoCRM has been added as a Data Source.
      • Data can now be extracted and transferred using SQL-like queries (COQL) specifically for ZohoCRM.
    • A deprecation notice has been added for the trigger job function.
      • Please move to a Workflow function that can be operated via GUI and allows more flexibility in defining dependencies.

    Data Catalog

    • Query Editor can now save and load saved queries.

    2022-04-04

    Connection Configuration

    • SSH connection is now supported in the Oracle Database Connection Configuration.
    • OAuth2.0 can now be selected as the authentication method in HubSpot Connection Configuration.

    ETL Configuration

    • In Data Source HubSpot, the items "Pipeline", "Pipeline Stage", and "Contact Person" can now be retrieved.
      • A list of "DEAL" and "TICKET" pipelines and their pipeline stages can now be retrieved.

    Security

    • When previewing data in ETL Configuration, etc., you can now choose whether to save (persist) the preview contents within the TROCCO service.
      • When this feature is enabled, preview contents are automatically deleted after a certain period of time after setup is complete, making the use of the system more secure.

    2022-03-28

    ETL Configuration

    • In Data Destination kintone, "update" and "upsert" have been added to the transfer mode.
      • More flexible data integration with kintone is now possible.
    • The items "subCampaignId" and "subCampaignName" can now be obtained at Data Source RTB House.
    • Deleted and archived records can now be retrieved in Data Source Salesforce.
      • Records of Task and Event objects created (archived) more than one year ago can now also be retrieved.

    Data Catalog

    • Sharded tables under any dataset are now displayed together.
    • Query editor autocompletion has been enhanced to make it easier to use.
      • Metadata about the entered table and usage information for the entered function are now suggested.

    2022-03-21

    datamart

    • In Snowflake, the error message when Connection Configuration is incorrectly specified is now easier to understand.

    Other UI/UX

    • "FTP" listed as a data source has been changed to "FTP/FTPS", and "HTTP" has been changed to "HTTP(S)".
      • The functionality itself remains unchanged; only the UI notation has been aligned with the functionality.

    2022-03-14

    ETL Configuration

    • The ad account ID is now suggested in the Data Source Facebook Ad Creative.
    • In Data Destination Google Sheets, the "Full Data Transfer (inherit existing sheets)" option has been added to the Transfer Mode.
      • Since the transfer contents are directly reflected in the target sheet, references are no longer broken in Google Data Portal, etc.
    • Data Destination Snowflake no longer leaves a stage on Snowflake after a successful ETL Job.

    Data Catalog

    • kintone has been added as a metadata collection target.
