
Error in Sqoop Import

Explorer

Hi, I am trying to import a table from MySQL to Hive in Parquet format. The table in MySQL contains a few DATE and TIMESTAMP columns. While importing the table to Hive, the DATE and TIMESTAMP columns are automatically converted to BIGINT. Could someone help me resolve this issue? I have provided the table structure in MySQL and the Sqoop command that I used.

CREATE TABLE customer (
  CUSTOMER_ID DECIMAL(27,0) NOT NULL,
  ABBOTT_CUSTOMER_ID VARCHAR(30),
  NUCLEUS_ID DECIMAL(27,0),
  ABBOTT_BEST_SPECIALTY_CODE VARCHAR(30),
  CUSTOMER_TYPE VARCHAR(100),
  PRESCRIBER_FLAG VARCHAR(1),
  FIRST_NAME VARCHAR(50),
  LAST_NAME VARCHAR(50),
  BIRTH_DATE DATE,
  NPI_NUMBER VARCHAR(100),
  EMAIL_ADDRESS VARCHAR(100),
  STANDARD_EMAIL_ADDRESS VARCHAR(100),
  DEA_REVOKED VARCHAR(20),
  DELETE_DATE TIMESTAMP(0),
  CREATION_DATE DATE,
  PDRP_SIGNUP_DATE DATE,
  PDRP_COMPLIANCE_DATE DATE,
  PDRP_FLAG CHAR(1),
  CREATED_BY VARCHAR(100),
  CREATED_DATE TIMESTAMP(0),
  MODIFIED_BY VARCHAR(100),
  MODIFIED_DATE TIMESTAMP(0),
  CUSTOMER_START_DATE TIMESTAMP(0),
  CUSTOMER_END_DATE TIMESTAMP(0),
  DDS_START_DATE TIMESTAMP(0),
  DDS_END_DATE TIMESTAMP(0),
  DDS_ACTIVE_FLAG CHAR(1),
  PRIMARY KEY (CUSTOMER_ID));

-- Hive import
sqoop import --connect "jdbc:mysql://quickstart.cloudera:3306/abbvie" \
  --username root --password cloudera --table customer \
  --hive-import --hive-table customer --create-hive-table --as-parquetfile \
  --map-column-hive created_date=STRING,modified_date=STRING \
  --incremental append --check-column "modified_date" --last-value "2017-03-31 04:22:24.0"

Also, in Sqoop, I am getting a warning: hdfs.DFSClient: Caught exception. Please help me resolve this issue.

Regards,

Kesav
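(Background: with --as-parquetfile, Sqoop writes the data through an Avro schema, so DATE and TIMESTAMP columns are serialized as epoch milliseconds, which Hive then exposes as BIGINT; in many Sqoop 1.4.x releases --map-column-hive is also ignored for Parquet imports. Below is a minimal sketch of one common workaround, forcing the affected columns to Java String on the Sqoop side with --map-column-java; everything else is copied from the command above, and the mapping would be repeated for every DATE/TIMESTAMP column needed as text.)

# Sketch: same import, but mapping the timestamp columns to Java String
# so Parquet stores readable text instead of epoch-millisecond longs.
sqoop import --connect "jdbc:mysql://quickstart.cloudera:3306/abbvie" \
  --username root --password cloudera --table customer \
  --hive-import --hive-table customer --create-hive-table --as-parquetfile \
  --map-column-java created_date=String,modified_date=String \
  --incremental append --check-column "modified_date" --last-value "2017-03-31 04:22:24.0"

The trade-off is that the columns land in Hive as STRING rather than TIMESTAMP; they can be cast back at query time if needed.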

6 REPLIES

Super Guru
Can you please share what the data looks like in MySQL and what it looks like in Hive after the import?

Can you show the output of "SHOW CREATE TABLE customer" in Hive?

Explorer

Hi Eric,

I am importing the data from a Teradata table using Sqoop. There are a few DATE and TIMESTAMP columns in the table. I have specified the output file format as Avro. In the .avsc file, I can see that the TIMESTAMP and DATE data types are changed to long, and the timestamp values are written as long in the output file. But I want the timestamps to be preserved as-is in the output file. Please help me fix this issue.


Regards,

Kesav
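(If the Avro files have already been written with long values, one option, rather than re-importing, is to convert at read time on the Hive side. A minimal sketch, assuming the longs are epoch milliseconds as Sqoop writes them, and using the customer_ddsd/modified_date names from later in this thread; from_unixtime() expects seconds, hence the division by 1000.)

# Sketch: read the epoch-millisecond longs back as readable timestamps.
hive -e "SELECT customer_id,
       from_unixtime(CAST(modified_date / 1000 AS BIGINT)) AS modified_ts
FROM customer_ddsd LIMIT 5"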

Explorer

Hi Eric,

Please find below the Sqoop command that I am using to import the data to Hive. I am getting an invalid timestamp error.


sqoop import -Dhadoop.security.credential.provider.path=jceks://hdfs/keystore/teradatad.password.jceks \
--connect "jdbc:teradata://10.72.23.136/DATABASE=DDSVD" --username DDSVAWSSQHUMDLREADD --password-alias teradatad.password.alias \
--table customer_dl --hive-import --hive-table customer_ddsd \
--as-textfile -m 1 --split-by customer_id \
--incremental append --check-column "modified_date" --last-value "2017-03-24 00:18:46.0"


Error: com.teradata.connector.common.exception.ConnectorException: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.10.00.26] [Error 6760] [SQLState HY000] Invalid timestamp.
at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:309)
at com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:103)
at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:311)
at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:200)
at com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:137)
at com.teradata.jdbc.jdbc_4.statemachine.PreparedStatementController.run(PreparedStatementController.java:46)

Super Collaborator

Well, "2017-03-24 00:18:46.0" is basically a string.

But I guess the field "modified_date" is a TIMESTAMP in your Teradata database.


Try supplying a timestamp value in the --last-value parameter.
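(For a TIMESTAMP(0) column there are no fractional seconds, so the trailing ".0" may be exactly what Teradata rejects. A minimal sketch of the command above with the fractional part dropped from --last-value:)

# Sketch: same incremental import, with --last-value trimmed to
# whole seconds to match a TIMESTAMP(0) check column.
sqoop import -Dhadoop.security.credential.provider.path=jceks://hdfs/keystore/teradatad.password.jceks \
  --connect "jdbc:teradata://10.72.23.136/DATABASE=DDSVD" \
  --username DDSVAWSSQHUMDLREADD --password-alias teradatad.password.alias \
  --table customer_dl --hive-import --hive-table customer_ddsd \
  --as-textfile -m 1 --split-by customer_id \
  --incremental append --check-column "modified_date" --last-value "2017-03-24 00:18:46"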

Explorer

Hi,

Yes, the modified_date column is declared as TIMESTAMP(0) in Teradata. I tried passing a timestamp value as well, "2017-03-24 00:18:46", but still got the same error. I even tried using the CAST function, but that errored out too. Could you please tell me how to pass the timestamp? When I use the value "2017-03-24 00:18:46" in the --where clause, it works fine.


Regards,

Kesav
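(Since the same literal works in a --where clause, one possible workaround, assuming the Teradata connector injects the --last-value literal without a proper timestamp cast, is to drop --incremental and filter with an explicit Teradata-side CAST, tracking the high-water mark yourself:)

# Sketch: sidestep --incremental by filtering with an explicit CAST;
# the last-imported modified_date then has to be tracked manually.
sqoop import -Dhadoop.security.credential.provider.path=jceks://hdfs/keystore/teradatad.password.jceks \
  --connect "jdbc:teradata://10.72.23.136/DATABASE=DDSVD" \
  --username DDSVAWSSQHUMDLREADD --password-alias teradatad.password.alias \
  --table customer_dl --hive-import --hive-table customer_ddsd \
  --as-textfile -m 1 --split-by customer_id \
  --where "modified_date > CAST('2017-03-24 00:18:46' AS TIMESTAMP(0))"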

Expert Contributor

Did you find a solution to this issue?