Error Messages
  • 24 Jan 2023

Note

This is a machine-translated version of the original Japanese article.
Please understand that some of the information contained on this page may be inaccurate.

This page explains the causes of error messages displayed by trocco and how to resolve them.

Error messages covered in this article

This page covers error messages that appear in the following logs.

  • Preview error log
  • Execution error log
  • Execution log

Transfer to BigQuery

Possible errors when connecting to BigQuery

org.embulk.exec.PartialExecutionException: java.net.SocketException: Connection or outbound has closed

Probable cause

This error occasionally occurs when trocco connects to the destination BigQuery and the network connection is interrupted.
In most cases, rerunning the job resolves the error.

Solution

You can avoid this error by adding retry settings to the transfer settings or workflow definitions (a conceptual sketch of the retry behavior follows the list below).

  • For transfer settings:
    • On the Job Settings tab of Transfer Settings STEP2, set the maximum number of retries to 1 or more.
  • For workflow definitions:
    • Set the number of retries in the job execution settings to 1 or more.
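
As a conceptual illustration only (not trocco's internal implementation), setting the maximum number of retries to 1 or more means a failed run is attempted again before the job is reported as failed. A minimal Python sketch, where run_transfer_job is a hypothetical callable standing in for the actual transfer:

    import time

    def run_with_retries(run_transfer_job, max_retries=1, wait_seconds=30):
        """Retry a job on transient failures such as dropped connections."""
        attempts = max_retries + 1  # the first run plus the configured retries
        for attempt in range(1, attempts + 1):
            try:
                return run_transfer_job()
            except ConnectionError:  # e.g. a transient network interruption
                if attempt == attempts:
                    raise  # retries exhausted; surface the error
                time.sleep(wait_seconds)  # back off before trying again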

Error when transferring a JSON column into BigQuery as a RECORD column

Field <column name> is type RECORD but has no schema.

Probable cause

This error occurs when a column defined as the JSON type in trocco is transferred as a RECORD type.
Specifically, the error occurs in the following situation:

  • In the column definition on the Data Settings tab of Transfer Settings STEP2, the column's data type is set to JSON.
  • In the column settings on the Output Options tab of Transfer Settings STEP2, the data type is set to RECORD and the transfer is run.

Solution

You can work around this by creating a table in BigQuery beforehand to be used as a template.

  1. On the BigQuery side, create a table with the same schema as the table you want to transfer, under the destination dataset (see the sketch below).
  2. In trocco, enter the name of the table created in the previous step in the "table that references the schema information as a template" field on the Output Options tab of Transfer Settings STEP2.
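
As an illustration of step 1, the sketch below creates a template table containing a RECORD column with the google-cloud-bigquery Python client. The project, dataset, table, and field names are placeholders; the BigQuery console, the bq CLI, or a DDL statement works just as well.

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application default credentials

    # Placeholder schema: "payload" stands in for the nested column that
    # trocco will transfer as a RECORD; adjust the sub-fields to your data.
    schema = [
        bigquery.SchemaField("id", "INTEGER"),
        bigquery.SchemaField(
            "payload",
            "RECORD",
            fields=[
                bigquery.SchemaField("key", "STRING"),
                bigquery.SchemaField("value", "STRING"),
            ],
        ),
    ]

    table = bigquery.Table("my-project.my_dataset.template_table", schema=schema)
    client.create_table(table)  # the table trocco references as a template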

Transfer to kintone

Error when transferring a column with an unsupported data type on the destination table

Caused by: com.kintone.client.exception.KintoneApiRuntimeException: HTTP error status 400, {"code":"CB_IJ01","id":"<ID>","message":"不正なJSON文字列です。"}

Probable cause

This error occurs when the data type of a column defined in trocco does not match the data type of the column defined in the destination kintone table.
Note that "code":"CB_IJ01" in the above message (不正なJSON文字列です。, meaning "Invalid JSON string.") is output when the value (the JSON built by trocco) is invalid for the destination table.

Solution

In the column settings on the Output Options tab of Transfer Settings STEP2, set a data type that matches the column definition on the kintone side.

Transfer from BigQuery

Error when referencing Google Spreadsheets as an external table

bigquery job failed: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credential.

Probable cause

This error occurs when the BigQuery table from which the data is retrieved references Google Spreadsheets as an external table.
Currently, trocco's source BigQuery does not support the transfer of tables that reference Google Spreadsheets as external tables.

Solution

In BigQuery, create a table that does not reference an external table, and set that table as the source table (a sketch follows below).
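
One possible way to do this, sketched below, is to materialize the spreadsheet-backed external table into a native table with a CREATE TABLE ... AS SELECT statement, issued here through the google-cloud-bigquery Python client. The dataset and table names are placeholders.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Copy the external (spreadsheet-backed) table into a native BigQuery
    # table, then point trocco's source settings at the native copy.
    sql = """
    CREATE OR REPLACE TABLE `my_dataset.orders_native` AS
    SELECT *
    FROM `my_dataset.orders_external_sheet`
    """
    client.query(sql).result()  # waits for the query job to finish

Note that the native copy is a snapshot, so it needs to be refreshed (for example, on a schedule) if the spreadsheet data changes.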

Source MySQL

Error when connection to MySQL cannot be established

Error: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

Probable cause

This error occurs when a connection with MySQL cannot be established.
Specifically, the following cases are possible.

  • Storage on the MySQL side is full and the server keeps restarting.
  • MySQL does not respond within the configured timeout period.
    • This can occur when changes such as version upgrades are made on the MySQL side.

Solution

  • First, check the startup status on the MySQL side.
  • If there is no problem with the MySQL startup status, do one of the following:
    • On the Input Options tab of trocco's Transfer Settings STEP2, increase the socket timeout value.
    • Increase the value of MySQL's net_read_timeout (see the sketch below).
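
If you need to inspect or raise net_read_timeout on the MySQL side, a minimal sketch using PyMySQL is shown below. The host, credentials, and the 600-second value are placeholders, and changing a global variable requires the appropriate privilege.

    import pymysql

    # Placeholder connection parameters
    conn = pymysql.connect(host="mysql.example.com", user="admin",
                           password="********", connect_timeout=10)

    with conn.cursor() as cur:
        cur.execute("SHOW VARIABLES LIKE 'net_read_timeout'")
        print(cur.fetchone())  # e.g. ('net_read_timeout', '30')

        # Raise the timeout (in seconds) if MySQL is slow to respond
        cur.execute("SET GLOBAL net_read_timeout = 600")

    conn.close()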

Transfer from Redshift

Error when the fetch size exceeds 1000 rows on a single-node configuration

Fetch size 10000 exceeds the limit of 1000 for a single node configuration. Reduce the client fetch/cache size or upgrade to a multi node installation

Probable cause

Redshift restricts single-node configurations to a fetch size of 1000 rows or less, so an error occurs when trocco tries to retrieve more than 1000 rows at a time.
You need to set the fetch size to 1000 rows or less.

Solution

In the Input Options tab of trocco's Transfer Settings STEP2, change the number of records that the cursor processes at one time to 1000 or less.
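
The "number of records that the cursor processes at one time" corresponds to the fetch size in the error message. As a rough illustration of the same idea outside trocco, the sketch below streams rows from Redshift in batches of at most 1000 using a psycopg2 server-side cursor; the connection details and table name are placeholders.

    import psycopg2

    # Placeholder connection details for a single-node Redshift cluster
    conn = psycopg2.connect(host="redshift.example.com", port=5439,
                            dbname="analytics", user="loader", password="********")

    # A named (server-side) cursor keeps results on the server and
    # fetches them in batches instead of all at once.
    with conn.cursor(name="batched_cursor") as cur:
        cur.itersize = 1000  # at most 1000 rows per round trip
        cur.execute("SELECT * FROM sales")
        for row in cur:
            pass  # process each row here

    conn.close()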

Source file storage system (S3, GCS, etc.)


For connectors that may display this error message, refer to File Storage Connectors.

Error when a numeric or datetime column contains unsupported characters

org.embulk.spi.DataException: Invalid record at <row number>
Caused by: org.embulk.standards.CsvParserPlugin$CsvRecordValidateException: java.lang.NumberFormatException: For input string: ""

Probable cause

This error occurs when a column whose type is inferred as numeric or datetime contains an unsupported character (such as NULL or an empty string).
In line with the Embulk specification, trocco raises an error if a numeric or datetime column contains a character that does not match the type.

More specifically, it occurs when both of the following apply:

  • During the automatic data setting that runs when moving to trocco's Transfer Settings STEP2, the column's data type is inferred as a numeric or datetime type.
  • The column whose data type was inferred as numeric or datetime contains unsupported characters.

Solution

There are two possible solutions.

Convert to string type and transfer

  1. In the column definition on the Data Settings tab of the Transfer Settings STEP2, set the data type of the corresponding column to string.
  2. Click Preview changes.

Transfer unsupported characters by replacing them with arbitrary numbers

* Depending on the destination, the column settings on the Output Options tab may not be available, in which case this workaround cannot be used.

  1. Using the string regular-expression replacement on the Data Settings tab of Transfer Settings STEP2, replace the unsupported characters with arbitrary numbers (see the sketch below).
  2. In the column settings on the Output Options tab of Transfer Settings STEP2, set the data type of the corresponding column to a numeric type (such as INTEGER).
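
As an illustration of what the regular-expression replacement does (plain Python, not trocco's actual implementation), the sketch below replaces empty strings and lone hyphens with 0 so that the value can be cast to an integer. The pattern and the replacement value are placeholders to adapt to your data.

    import re

    # Placeholder pattern: an empty string or a lone hyphen
    PATTERN = re.compile(r"^$|^-$")

    def to_numeric_string(value: str) -> str:
        """Replace unsupported values with "0" so the column casts to INTEGER."""
        return PATTERN.sub("0", value)

    print(to_numeric_string(""))    # -> "0"
    print(to_numeric_string("-"))   # -> "0"
    print(to_numeric_string("42"))  # -> "42"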

Transfer from Salesforce

Error due to insufficient settings on the Salesforce side

Setup::Error::ConfigError: (INVALID_LOGIN) INVALID_LOGIN: Invalid username, password, security token; or user locked out.

Probable cause

This error occurs when you cannot log in to Salesforce from trocco.
Specifically, the following cases are possible.

  • IP restrictions are configured on the Salesforce side, and trocco's IP addresses are not allowed.
  • The Salesforce login URL is restricted.
  • The Salesforce account linked to the connection information does not have permission to use the API.

Solution

Review the Salesforce-side settings corresponding to the causes listed above: allow trocco's IP addresses in the IP restrictions, check the login URL restriction, and make sure the account linked to the connection information has permission to use the API.

Error when a numeric column contains a non-numeric value

cannot cast String to long: "-"
NumberFormatException: For input string: "-"

Probable cause

This error occurs when a numeric column contains non-numeric characters.
In the error message above, the error is caused by a symbol (a hyphen) in a numeric-type column.

Solution

There are two possible solutions.

Convert to string type and transfer

  1. In the column definition on the Data Settings tab of the Transfer Settings STEP2, set the data type of the corresponding column to string.
  2. Click Preview changes.

Transfer unsupported characters by replacing them with arbitrary numbers

* Depending on the destination, the column settings on the Output Options tab may not be available, in which case this workaround cannot be used.

  1. Using the string regular-expression replacement on the Data Settings tab of Transfer Settings STEP2, replace the non-numeric characters with arbitrary numbers, as in the regular-expression sketch shown earlier.
  2. In the column settings on the Output Options tab of Transfer Settings STEP2, set the data type of the corresponding column to a numeric type (such as INTEGER).

Transfer from Facebook Ads Insights

Error due to aborted processing in Facebook API

async was aborted because the number of retries exceeded the limit

Probable cause

This error occurs when a request to the Facebook API takes too long (roughly 15 minutes or more) and the Facebook API automatically aborts the process.
It is often caused by retrieving too much data.

Solution

In trocco's Transfer Settings STEP1, narrow the data acquisition period.

Transfer settings

Error when there is a problem with Programming ETL

Error: org.embulk.exec.ExecutionInterruptedException: java.lang.Exception: Internal API Error

Probable cause

This error occurs when there is an error in the code written in Programming ETL, or when the amount of data processed is so large that the memory allocated to Programming ETL is exhausted.

Solution

Review the code and modify it to reduce the amount of data processed at one time (a sketch of the idea follows below).
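
As a general illustration of reducing memory use (plain Python, not trocco's Programming ETL API), process records one at a time instead of accumulating them all in a list; the file and column names are placeholders.

    import csv

    def transform(row):
        """Placeholder transformation applied to a single record."""
        row["amount"] = int(row["amount"] or 0)  # "amount" is a placeholder column
        return row

    # Memory-friendly: stream rows one by one instead of loading the whole
    # file into memory before transforming it.
    with open("input.csv", newline="") as src, open("output.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            writer.writerow(transform(row))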

Google services as a source or destination

Error when the connection information for a Google service is incorrect

org.embulk.exec.PartialExecutionException: java.lang.RuntimeException: java.lang.IllegalArgumentException:
Caused by: java.lang.IllegalArgumentException: expected primitive class, but got: class com.google.api.client.json.GenericJson

Probable cause

This error occurs when the connection information for a Google service (BigQuery, Google Spreadsheets, Google Drive, etc.) is created with a JSON key and the content of the JSON key is incorrect.
It is often caused by pasting the JSON key in an incomplete form when creating the connection information.

Solution

  1. Open the created JSON key in any text editor, select all of the text, and copy it.
  2. Paste it into the JSON key field on the connection information edit screen and save the connection information (a sketch for checking the key beforehand follows below).
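
If you want to check beforehand that the copied key is complete and valid JSON, a small sketch is shown below. The file path is a placeholder, and the listed field names are the standard ones found in a Google service account JSON key.

    import json

    # Placeholder path to the downloaded service account key
    with open("service-account-key.json") as f:
        key = json.load(f)  # raises json.JSONDecodeError if the key is truncated

    # Standard fields present in a complete service account JSON key
    for field in ("type", "project_id", "private_key", "client_email"):
        if field not in key:
            raise ValueError(f"JSON key is missing the '{field}' field")

    print("JSON key looks complete.")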


