Error Messages
  • 17 Jul 2024

This page explains the causes and solutions for error messages displayed by TROCCO.

Error messages addressed in this article

This page covers error messages that appear in the following logs:
* Preview error log
* Execution error log
* Execution log

Destination BigQuery

Errors that can occur when connecting to BigQuery

PartialExecutionException: java.net.SocketException: Connection or outbound has closed

Possible Causes

This is a rare error that occurs when TROCCO connects to the destination BigQuery and is caused by a temporary network failure.
In most cases, the error can be resolved by rerunning the job.

Solution

The error can be avoided by adding retry settings to the transfer settings or the workflow definition.
* For transfer settings:
  * Set the maximum number of retries to 1 or more on the Job Settings tab of Transfer Settings STEP 2.
* For workflow definitions:
  * Set the number of retries in the job execution settings to 1 or more.

Error when transferring nested JSON columns to BigQuery

Field <column name> is type RECORD but has no schema.

Possible Causes

This error occurs when a column defined as a JSON type in TROCCO is transferred as a RECORD type.
Specifically, the error occurs in the following situation:
* In the column definition on the Data Settings tab of Transfer Settings STEP 2, the column's data type is set to json.
* In the column settings on the Output Options tab of Transfer Settings STEP 2, the same column's data type is set to RECORD and the transfer is executed.

Solution

This can be handled by creating a table on BigQuery in advance to use as a template.
On the BigQuery side, create a table with the same schema as the table you want to transfer, under the dataset you plan to transfer to.
In TROCCO, enter the name of the table created in the previous step in the "table that refers to the schema information as a template" field on the Output Options tab of Transfer Settings STEP 2.
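As a reference, the following is a minimal sketch of creating such a template table with the google-cloud-bigquery Python client. The project, dataset, table, and column names are placeholders; match the schema to the table you actually transfer.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project ID

    # Same schema as the table to be transferred, including the nested RECORD column.
    schema = [
        bigquery.SchemaField("id", "INTEGER"),
        bigquery.SchemaField(
            "attributes",  # hypothetical column transferred as RECORD
            "RECORD",
            fields=[
                bigquery.SchemaField("key", "STRING"),
                bigquery.SchemaField("value", "STRING"),
            ],
        ),
    ]

    table = bigquery.Table("my-project.my_dataset.template_table", schema=schema)
    client.create_table(table)  # enter "template_table" in TROCCO's Output Options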

Destination kintone

Error when transferring a column of a data type not supported by the destination table

Caused by: com.kintone.client.exception.KintoneApiRuntimeException: HTTP error status 400, {"code":"CB_IJ01","id":"<ID>","message":"Illegal JSON string."}

Possible Causes

This error occurs when the data type of a column defined in TROCCO does not match the data type defined in the destination kintone table.
Note that "code":"CB_IJ01" in the error message above is output when the value (the JSON generated by TROCCO) is invalid for the destination table.

Solution

In the column settings on the Output Options tab of Transfer Settings STEP 2, set a data type that matches the one defined in the destination kintone table.

Transfer source BigQuery

Error when referencing Google Spreadsheets as an external table

bigquery job failed: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credential.

Possible Causes

This error occurs when the BigQuery table from which data is retrieved references Google Spreadsheets as an external table.
Currently, TROCCO's transfer source BigQuery does not support the transfer of tables that reference Google Spreadsheets as an external table.

Solution

On BigQuery, create a table that does not reference an external table and set that table as the table from which data is retrieved.
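As one way to do this, the sketch below materializes the Sheets-backed external table into a native table with the google-cloud-bigquery Python client. The project, dataset, and table names are placeholders; run it with credentials that have Drive access to the spreadsheet, then set the native copy as the source table in TROCCO.

    import google.auth
    from google.cloud import bigquery

    # The credentials need the Drive scope to read the Sheets-backed external table.
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/bigquery",
                "https://www.googleapis.com/auth/drive.readonly"]
    )
    client = bigquery.Client(project="my-project", credentials=credentials)

    # Copy the external table into a native table that TROCCO can read.
    query = """
    CREATE OR REPLACE TABLE `my-project.my_dataset.native_copy` AS
    SELECT * FROM `my-project.my_dataset.sheets_external_table`
    """
    client.query(query).result()  # wait until the native table is created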

Transfer source MySQL

Error when connection to MySQL cannot be established

Error: "Error. RuntimeException: java.lang. CommunicationsException: com.mysql.jdbc.exceptions.jdbc4. Communications link failure

Possible Causes

This error occurs when a connection to MySQL cannot be established.
Specifically, the following cases are possible:
* The storage on the MySQL side is full and the server is repeatedly restarting.
* MySQL does not respond within the configured timeout value.
* Changes have been made on the MySQL side, such as a version upgrade.

Solution

  • First, check whether MySQL itself is up and running.
  • If there is no problem with MySQL's startup status, do one of the following (see the sketch after this list):
    • On the Input Options tab of Transfer Settings STEP 2 in TROCCO, increase the socket timeout value.
    • Increase the value of MySQL's net_read_timeout.
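For reference, the following is a minimal sketch of checking and extending net_read_timeout with the PyMySQL driver. The host, credentials, and timeout value are placeholders, and changing the global variable requires an account with the appropriate privileges.

    import pymysql

    conn = pymysql.connect(host="mysql.example.com", user="admin",
                           password="********", database="mysql")
    with conn.cursor() as cur:
        cur.execute("SHOW VARIABLES LIKE 'net_read_timeout'")
        print(cur.fetchone())                             # current value in seconds
        cur.execute("SET GLOBAL net_read_timeout = 600")  # extend to 600 seconds
    conn.close()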

Error when unable to establish SSH/SSM connection

An error occurred during SSH connection. Please check your settings. Net::SSH::Proxy::ConnectError: Failed to connect to your bastion host. Please check your SSM configuration.

Possible Causes

This error is caused by the environment on the MySQL, SSH, or SSM side.

Solution

Please take one of the following actions:
* Check the number of simultaneous sshd connections, the logs (/var/log/secure), and so on, to locate the problem.
  Then re-run the job after raising the limit on the number of simultaneous sshd connections.
* The error may be due to a temporary environmental problem, so wait a while and re-run the job.
  If the problem still occurs frequently, add retry settings under "Job Settings" in STEP 2 of TROCCO's Transfer Settings.

Transfer source Redshift

Error when the fetch size exceeds 1,000 rows

Fetch size 10000 exceeds the limit of 1000 for a single node configuration. Reduce the client fetch/cache size or upgrade to a multi node installation

Possible Causes

Redshift restricts the fetch size to a maximum of 1,000 rows on single-node configurations, so an error occurs if the fetch size used by TROCCO exceeds 1,000 rows.
The fetch size must be set to 1,000 rows or less.

Solution

In the Input Options tab of STEP 2 of TROCCO's Transfer Settings, change the number of records the cursor processes at one time to 1000 or less.
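The sketch below illustrates the same constraint outside of TROCCO, using a psycopg2 server-side cursor against Redshift. The connection parameters and table name are placeholders, and this is only an illustration, not TROCCO's internal implementation.

    import psycopg2

    conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com", port=5439,
                            dbname="dev", user="awsuser", password="********")
    cur = conn.cursor(name="fetch_demo")  # named cursor = server-side cursor
    cur.itersize = 1000                   # single-node clusters reject fetch sizes above 1,000
    cur.execute("SELECT * FROM sales")    # hypothetical table
    for row in cur:
        pass                              # process rows in batches of at most 1,000
    conn.close()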

Transfer source file/storage system (S3, GCS, etc.)

Refer to the file/storage system connector for connectors that may display this error message.

Error when a numeric or date/time column contains an unsupported value

org.embulk.spi.DataException: Invalid record at <line number>.
Caused by: org.embulk.standards.CsvParserPlugin$CsvRecordValidateException: NumberFormatException: For input string: ""

Possible Causes

This error occurs when a column inferred as a numeric or date/time type contains unsupported values (null, empty string, etc.).
In TROCCO, in accordance with Embulk's specifications, an error occurs if numeric or date/time columns contain unsupported values.

More specifically, this occurs when both of the following apply:
* The column's data type was inferred as numeric or date/time by the automatic data setup executed when moving to STEP 2 of TROCCO's transfer setup.
* The column whose data type was inferred as numeric or date/time contains unsupported values.

Solution

There are two possible ways to handle this.

Convert to string type and transfer

  1. Set the data type of the appropriate column to "string" in the column definition in the Data Settings tab of Transfer Settings STEP 2.
  2. Click Preview Changes.

Replace unsupported values with an arbitrary number and transfer

*Depending on the destination, this approach may not be available because the Output Options tab does not provide column settings.

  1. Using the regular-expression replacement on the Data Settings tab of Transfer Settings STEP 2, convert the aforementioned unsupported values to an arbitrary number (a sketch follows these steps).
  2. In the column settings on the Output Options tab of Transfer Settings STEP 2, set the data type of the relevant column to a numeric type (e.g., INTEGER).
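The sketch below shows, in plain Python, the effect of the regular-expression replacement described in step 1. The sample values and the substitute number 0 are placeholders for whatever your data and replacement rule use.

    import re

    raw_values = ["12", "", "-", "34"]  # "" and "-" cannot be cast to a number
    cleaned = [re.sub(r"^(-|)$", "0", v) for v in raw_values]
    print(cleaned)  # ['12', '0', '0', '34'] -- now castable to INTEGER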

Transfer source Salesforce

Error due to insufficient settings on the Salesforce side

Setup::Error::ConfigError: (INVALID_LOGIN) INVALID_LOGIN: Invalid username, password, security token; or user locked out.

Possible Causes

This error occurs when TROCCO cannot log in to Salesforce.
Specifically, the following cases are possible:

  • IP restrictions are in place on the Salesforce side and TROCCO's IP address is not allowed.
  • Salesforce login URL is restricted.
  • The Salesforce account associated with the connection information is not authorized to use the API.

Solution

Address the causes listed above on the Salesforce side: allow TROCCO's IP addresses if IP restrictions are in place, review the login URL restriction, and grant API permission to the Salesforce account associated with the connection information.

Error when a numeric type column contains a non-numeric value

cannot cast String to long: "-" NumberFormatException: For input string: "-"

Possible Causes

This error occurs when a numeric type column contains non-numeric characters.
In the error message above, the error is caused by a symbol (hyphen) in a numeric type column.

Solution

There are two possible ways to handle this.

Convert to string type and transfer

  1. Set the data type of the appropriate column to "string" in the column definition in the Data Settings tab of Transfer Settings STEP 2.
  2. Click Preview Changes.

Replace unsupported values with an arbitrary number and transfer

*Depending on the destination, this approach may not be available because the Output Options tab does not provide column settings.

  1. Using the regular-expression replacement on the Data Settings tab of Transfer Settings STEP 2, convert the aforementioned unsupported values to an arbitrary number.
  2. In the column settings on the Output Options tab of Transfer Settings STEP 2, set the data type of the relevant column to a numeric type (e.g., INTEGER).

Transfer source kintone

Error when the token is not authorized

Application ID acquisition error
An unexpected error has occurred. Please contact TROCCO's support team.

Possible Causes

This message is displayed when the application ID cannot be obtained in Transfer Settings STEP 1.
The error occurs when the token used in the connection information has not been granted the "Record View" permission or the "Record Add" permission.

Solution

Please grant "Record View Authority" and "Record Add Authority" on the kintone side to the token used for connection information.
Check the official documentation for details on permissions.

Transfer source Google Spreadsheets

Error when specifying a sheet name that does not exist in the spreadsheet

Error: "Error. (ClientError) badRequest:. Unable to parse range: <sheet name>

Possible Causes

This error occurs when the sheet name set in Transfer Settings STEP1 is different from the sheet name on the spreadsheet specified in the sheet URL.

Solution

Open the file specified by the sheet URL and verify the following:
- Does the sheet name exist?
- Are there any unintended spaces or other typos in the sheet name?

Transfer source Facebook Ad Insights

Error due to processing termination in Facebook API

async was aborted because the number of retries exceeded the limit

Possible Causes

This error occurs when a request to the Facebook API takes too long (more than about 15 minutes) and the Facebook API automatically terminates the process.
This error may be caused by too much acquired data.

Solution

Narrow the range of the data acquisition period in STEP 1 of TROCCO's transfer settings.

Transfer source X Ads (formerly Twitter Ads)

Error when the token is not authorized to use the API

{"errors": [{"code":"UNAUTHORIZED_CLIENT_APPLICATION","message":"The client application making this request does not have access to Twitter Ads API"}], "request":{"params":{}}

Possible Causes

The following causes are possible:
* The application to use the Twitter Ads API has not been approved.
* A token generated before the API usage application was approved is registered in the connection information.

Solution

Please confirm that your application to use the Twitter Ads API has been approved.
If you generated a token before the application was approved, apply for Twitter Ads API usage and generate a token again in the following order.

  1. Apply to use the API
  2. Apply to use the Ads API
  3. Generate tokens

Transfer source LINE Ads

Error when the date format of the data acquisition period is incorrect

code: 400 {"errors": [{"reason":"INVALID_VALUE","message":"the value is invalid","property":"since"}]}

Possible Causes

This error occurs when the date format of the data acquisition period is incorrectly specified when a performance report is selected as the download type in STEP 1 of the transfer setup.

Solution

Specify the dates of the data acquisition period in the %Y-%m-%d (YYYY-MM-DD) format. (Example: 2023-02-01)
If custom variables are used, specify the date format of the custom variables as %Y-%m-%d as well.
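For reference, the %Y-%m-%d format corresponds to the following Python strftime output; the dates themselves are examples.

    from datetime import date

    since = date(2023, 2, 1).strftime("%Y-%m-%d")
    until = date(2023, 2, 28).strftime("%Y-%m-%d")
    print(since, until)  # 2023-02-01 2023-02-28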

Transfer source App Store Connect API

Error when the private key in the connection information is incorrect

Error: "Error. RetryGiveupException: org.embulk.util.retryhelper. HttpException:. Request is not successful, code=401, body={ [0x2b][0x2c]"errors":. [{[0x39][0x3a][0x3b]"status":. "401", [0x4c][0x4d][0x4e]"code":. "NOT_AUTHORIZED", [0x68][0x69][0x6a]"title":. "Authentication credentials are missing or invalid.

Possible Causes

This error occurs when the private key registered in the connection information used in the transfer settings is not entered in full, for example when the ---BEGIN *--- and ---END *--- sections are deleted.

Solution

Copy the entire issued private key string, paste it into the private key field of the connection information, and save it.

Error when the necessary permissions are not granted to the Issuer ID in the connection information

Error: "Error. RetryGiveupException: org.embulk.util.retryhelper. HttpException:. Request is not successful, code=403, body={ [0x2b] "errors" :. [ {[0x3c]    "id" :. "xxxxxxxx", [0x6f] "status" :. "403", [0x85] "code" :. "FORBIDDEN_ERROR", [0xa5] "title" :. "This request is forbidden for security reasons", [0xe5] "detail" :. "The API key in use does not allow this request" [0x125] } ] [0x12b]}

Possible Causes

This error occurs when the required permissions are not granted to the Issuer ID registered in the connection information used for the transfer settings.

Solution

On the App Store Connect side, grant the Finance permission for "Reporting and Analytics" to the Issuer ID registered in the connection information.

Transfer Settings

Error when acquired data does not exist at preview or job execution

Error: "Error. No input records to preview

Possible Causes

The cause of the error depends on when the error is displayed.

If the error is displayed when previewing in Transfer Settings STEP 2

If no data can be retrieved from the transfer source configured in Transfer Settings STEP 1, the preview displays this error.

If the error is displayed when executing a transfer job

If a transfer job is executed with schema change detection turned on and no data is retrieved from the source, the job fails.
Specifically, the following cases are possible:

  • Differential transfer was selected as the transfer method and no new records were generated after the last job execution.
    - No records were generated within the specified data acquisition period.

Solution

If the error is displayed when previewing in Transfer Settings STEP 2

  • For transfer sources where a data acquisition period can be set
    1. Extend the data acquisition period to a range in which data exists.
    2. Click "Run automatic data setup" or "Preview changes" in STEP 2 and check whether a preview appears.
  • For transfer sources where records can be filtered
    1. If you have narrowed down the records with a query or other means, remove the filtering.
    2. Click "Run automatic data setup" or "Preview changes" in STEP 2 and check whether a preview appears.
  • For file/storage-based transfer sources
    1. Verify that the file exists at the specified path.
    2. Make sure the path is specified in the correct format.

If the error is displayed when executing a transfer job

If the error is displayed due to the causes mentioned above, this is expected behavior.
Run the job again after records have been generated.

Error when memory is exhausted due to a huge amount of transferred data

OutOfMemoryError: GC overhead limit exceeded

Possible Causes

This error occurs when the amount of data transferred in one job exceeds TROCCO's processing capacity and the memory of TROCCO's job execution container is used up.
More specifically, this occurs in the following cases:
- During data acquisition: the amount of data to be acquired is enormous, depending on the contents of TROCCO's transfer settings.
- During data loading: the processing volume is enormous, depending on settings such as the number of simultaneous connections on the destination connector side.

Solution

There are two possible ways to handle this.

Split transfer data

If the amount of data retrieved is huge due to the source settings, reduce the amount of data retrieved at one time.
For example, adjust the following parts of the source settings to reduce the amount of data acquired.
- Transfer source file/storage system: path prefix
  - Specify a deeper hierarchy to reduce the number of files retrieved at one time.
  - You can also retrieve one file at a time by embedding a custom variable.
- Transfer source database system: query
  - Reduce the number of records retrieved at one time by adding a WHERE clause.
  - By embedding custom variables in the WHERE clause, you can dynamically specify which records to retrieve for each run.
- Transfer source cloud application/advertising system: data acquisition period
  - Reduce the number of records retrieved at one time by narrowing the period.
  - Custom variables can be embedded for the start and end dates of data acquisition.

By using custom variable loop execution, you can transfer a huge amount of data while limiting how much is retrieved at a time (a rough sketch of the idea follows).
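The sketch below illustrates the idea behind custom variable loop execution: a long acquisition period is split into smaller windows, and each window corresponds to one job run. The date range and window size are placeholders, and the actual loop is configured in TROCCO rather than in code.

    from datetime import date, timedelta

    start, end, window = date(2024, 1, 1), date(2024, 3, 31), timedelta(days=7)

    cursor = start
    while cursor <= end:
        window_end = min(cursor + window - timedelta(days=1), end)
        # Each pair would become the period's custom variable values for one job run.
        print(cursor.isoformat(), window_end.isoformat())
        cursor = window_end + timedelta(days=1)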

Tuning the transfer destination

Raise the limit on the number of simultaneous connections on the destination service side. When TROCCO transfers data, it may issue multiple processing requests to the destination service.
For example, if the destination is Snowflake, see the concurrent query limit.
In addition, running multiple jobs simultaneously in TROCCO results in many processing requests to the destination service.
Adjust the schedule settings and the number of parallel workflow executions so that too many jobs do not run at the same time.

Error when a newly added column has no default value

columns: Column src '<new_column>' is not found in inputSchema. Column '<new_column>' does not have "type" and "default" Suppressed: NullPointerException

Possible Causes

This error occurs when both of the following conditions are met when a column is added in the column definition in Transfer Settings STEP 2 and Preview Changes is clicked.
- "Add New" is selected as the source of the column.
- No default value is entered (the field is blank).

When a column is added manually in the column definition, an arbitrary value must be entered in the Default value field.

Solution

The error is resolved by entering a default value for the newly added column and clicking Preview Changes to apply the change.
If you do not want to store the default value in the destination, you can store an empty string as follows:
- In the column definition, enter an arbitrary value as the default value.
  - Example: 999
- Use the string regular-expression replacement to replace the value entered as the default value with an empty string.
  - Example:
    - Regular expression pattern: 999
    - Replacement string: enter nothing (blank)

As a result of the above method, an empty string is stored.
If you wish to store NULLs, consider using programming ETL.

Error when there is a problem with programming ETL

Error: "Error. org.embulk.exec.ExecutionInterruptedException:. java.lang.Exception:. Internal API Error

Possible Causes

This error occurs when there is an error in the code written in programming ETL, or when the amount of data processed is so large that the memory allocated for programming ETL is used up.

Solution

Please check the code for errors, or modify it so that the amount of data processed is reduced.

Source and Destination Services provided by Google

Error when there is an error in the connection information for services provided by Google

PartialExecutionException: java.lang.RuntimeException: IllegalArgumentException:
Caused by: IllegalArgumentException: expected primitive class, but got: class com.google.api.client.json.GenericJson

Possible Causes

This error occurs when connection information for services provided by Google (BigQuery, Google Spreadsheets, Google Drive, etc.) is created using a JSON key and the content of the JSON key is incorrect.
This error is often caused by entering incomplete JSON keys when creating connection information.

Solution

  1. Open the created JSON key in any text editor, select all of the text, and copy it.
  2. Paste it into the JSON key field on the connection information edit screen and save the connection information (a quick validation sketch follows).
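As a quick check before pasting, the sketch below verifies that the copied key is complete, valid JSON and contains the fields a service account key normally includes. The file name is a placeholder.

    import json

    with open("key.json", encoding="utf-8") as f:
        key = json.loads(f.read())  # raises an error if the key is truncated

    required = {"type", "project_id", "private_key", "client_email"}
    print("missing fields:", required - key.keys() or "none")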
