
Sqoop export from Hive to Teradata: fails when a Hive column exceeds 64K bytes


Hi all,

Does anyone know how to load a Hive table into Teradata using Sqoop when one of its columns is larger than 64K bytes?

 

I'm using the syntax below:

 

sqoop export \
  -D sqoop.connection.factories=com.cloudera.sqoop.manager.TeradataManagerFactory \
  --connect 'jdbc:servername/database=db_name' \
  --username **** -P \
  --table table1 \
  --hcatalog-table table1 \
  --hcatalog-database dnt_data \
  --fields-terminated-by '\001' \
  --m 1

 

Error:

com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata JDBC Driver] [TeraJDBC 14.00.00.21] [Error 1186] [SQLState HY000] Parameter 3 length is 169310 bytes, which is greater than the maximum 64000 bytes that can be set. 
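The driver is rejecting the bind because a non-LOB parameter in Teradata tops out at 64000 bytes (the maximum for a VARCHAR); values larger than that generally require the target column to be defined as a CLOB, and the TDCH-based Teradata connector may handle such wide columns differently than the plain JDBC path, so its documentation is worth checking. A first diagnostic step is to confirm how wide the offending column really is. A minimal sketch, assuming the wide column is named big_col (a placeholder, substitute the real column name):

```shell
# Sketch: check how wide the problem column actually is before changing the
# Teradata schema. "big_col" is a placeholder -- substitute the real column.
DB=dnt_data
TABLE=table1
COL=big_col

QUERY="SELECT MAX(LENGTH(${COL})) FROM ${DB}.${TABLE}"
printf '%s\n' "Would run: ${QUERY}"
# hive -e "${QUERY}"   # uncomment on a host with the Hive CLI installed
```

If the result is over 64000 (here it is at least 169310, per the error), the Teradata DDL has to change before any Sqoop option will help.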

 

I'd also like to apply a filter to the Hive table while exporting, without creating a new Hive table (I only have read access; I can't create any Hive tables).
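Sqoop export has no query-style filter of its own, but one common workaround that needs no CREATE TABLE privilege is to stage the filtered rows into an HDFS directory with INSERT OVERWRITE DIRECTORY, then point sqoop export at that directory with --export-dir instead of the HCatalog table. A hedged sketch, where the staging path and the filter predicate are hypothetical placeholders:

```shell
# Sketch: filter during export without creating a Hive table.
# Step 1 stages the filtered rows into an HDFS directory (needs HDFS write
# access, but no CREATE TABLE privilege); step 2 exports that directory.
# STAGE_DIR and FILTER are hypothetical -- adjust for your environment.
STAGE_DIR=/tmp/table1_filtered
FILTER="load_date = '2017-11-01'"

HQL="INSERT OVERWRITE DIRECTORY '${STAGE_DIR}'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
SELECT * FROM dnt_data.table1 WHERE ${FILTER}"
printf '%s\n' "${HQL}"
# hive -e "${HQL}"   # step 1: stage the filtered rows

# step 2: export the staged files instead of the HCatalog table
# sqoop export \
#   -D sqoop.connection.factories=com.cloudera.sqoop.manager.TeradataManagerFactory \
#   --connect 'jdbc:servername/database=db_name' --username **** -P \
#   --table table1 \
#   --export-dir "${STAGE_DIR}" \
#   --fields-terminated-by '\001' \
#   -m 1
```

Keeping the same '\001' delimiter in both steps matters, since sqoop export parses the staged files with --fields-terminated-by.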

 

Appreciate your help.
