

Sqoop export from Hive to Teradata: issue due to Hive column size exceeding 64K bytes

New Contributor

Hi All,

Does anyone know how to load a Hive table (one of the columns has more than 64K bytes) into Teradata using Sqoop?

I'm using the syntax below:

sqoop export \
  -D sqoop.connection.factories=com.cloudera.sqoop.manager.TeradataManagerFactory \
  --connect 'jdbc:teradata://servername/database=db_name' \
  --username **** -P \
  --table table1 \
  --hcatalog-table table1 \
  --hcatalog-database dnt_data \
  --fields-terminated-by '\001' \
  -m 1

 

Error:

com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata JDBC Driver] [TeraJDBC 14.00.00.21] [Error 1186] [SQLState HY000] Parameter 3 length is 169310 bytes, which is greater than the maximum 64000 bytes that can be set. 
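
To confirm which rows trip the limit, a quick Hive check like the one below can help. This is only a sketch: wide_col is a placeholder for the offending column, and Hive's length() counts characters rather than bytes, so the byte count can be higher for multi-byte data.

-- Sketch: count rows whose wide column exceeds Teradata's 64,000-byte limit.
-- wide_col is a hypothetical column name; substitute the real one.
SELECT COUNT(*)              AS oversize_rows,
       MAX(length(wide_col)) AS max_length
FROM dnt_data.table1
WHERE length(wide_col) > 64000;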

 

I'm also looking to apply a filter on the Hive table while exporting the data, without creating a new Hive table (I don't have access to create Hive tables; I can only read them). One possible workaround is sketched below.
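
One approach that avoids needing CREATE TABLE rights is to stage the filtered rows in an HDFS directory with INSERT OVERWRITE DIRECTORY (which needs only HDFS write access), then export that directory instead of the HCatalog table. This is a sketch under assumptions: /tmp/table1_export and the WHERE clause are placeholders, and ROW FORMAT in INSERT OVERWRITE DIRECTORY requires Hive 0.11 or later.

# Sketch: stage filtered rows in HDFS without creating a Hive table.
# /tmp/table1_export and the WHERE clause are placeholders.
hive -e "
  INSERT OVERWRITE DIRECTORY '/tmp/table1_export'
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
  SELECT * FROM dnt_data.table1
  WHERE some_filter_col = 'some_value';
"

# Then export the staged files instead of the HCatalog table.
sqoop export \
  -D sqoop.connection.factories=com.cloudera.sqoop.manager.TeradataManagerFactory \
  --connect 'jdbc:teradata://servername/database=db_name' \
  --username **** -P \
  --table table1 \
  --export-dir /tmp/table1_export \
  --input-fields-terminated-by '\001' \
  -m 1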

 

Appreciate your help.

1 ACCEPTED SOLUTION

Re: Sqoop export from Hive to Teradata: issue due to Hive column size exceeding 64K bytes

Guru

This is an old thread, but I will add my recent findings.

It is a limitation/restriction on the Teradata side: data larger than 64KB must be streamed into Teradata through a special API. Sqoop currently does not use this API, so it does not support inserting data larger than 64KB into Teradata.

An improvement JIRA has been requested but is not resolved at this stage. For the time being, you have to either reduce the data or use another database; a truncation sketch follows below.

I have tested the same scenario using MySQL's CLOB, and it has no issues.
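
If truncating the oversize column is an acceptable way to "reduce the data", something like the following could be combined with the staging approach sketched in the question above. This is only a sketch: wide_col, col1, and col2 are hypothetical column names, and Hive's substr() counts characters rather than bytes, so multi-byte data needs a lower cap.

-- Sketch: cap a wide column below Teradata's 64,000-byte limit before export.
-- wide_col, col1, col2 are placeholders; substr() counts characters, not bytes.
INSERT OVERWRITE DIRECTORY '/tmp/table1_export'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
SELECT col1, col2, substr(wide_col, 1, 64000) AS wide_col
FROM dnt_data.table1;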

Cheers
Eric


