Posted 11-01-2017 04:52 PM
Hi All,
Does anyone know how to load a Hive table (one of the columns has more than 64K bytes) into Teradata using Sqoop?
I'm using the syntax below:
sqoop export \
  -D sqoop.connection.factories=com.cloudera.sqoop.manager.TeradataManagerFactory \
  --connect 'jdbc:servername/database=db_name' \
  --username **** -P \
  --table table1 \
  --hcatalog-table table1 \
  --hcatalog-database dnt_data \
  --fields-terminated-by '\001' \
  --m 1
The error I get:
com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata JDBC Driver] [TeraJDBC 14.00.00.21] [Error 1186] [SQLState HY000] Parameter 3 length is 169310 bytes, which is greater than the maximum 64000 bytes that can be set.
I'm also looking to apply a filter to the Hive table while exporting the data, without creating a new Hive table (I don't have access to create any Hive tables; I can only read them).
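One possible workaround sketch, under stated assumptions: `INSERT OVERWRITE DIRECTORY` needs only HDFS write access, not CREATE TABLE privileges, so the filter (and, if truncating the oversized column is acceptable, a `SUBSTR` to stay under Teradata's 64000-byte VARCHAR cap) can be applied in the Hive query before exporting the staged files with `--export-dir`. The staging path `/tmp/table1_export`, the column names, and the WHERE predicate below are placeholders, not taken from the original post:

```shell
# Stage a filtered (and truncated) projection to HDFS.
# Requires only HDFS write access, not CREATE TABLE rights.
# ROW FORMAT DELIMITED in INSERT OVERWRITE DIRECTORY needs Hive 0.11+.
hive -e "
INSERT OVERWRITE DIRECTORY '/tmp/table1_export'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
SELECT col1,
       col2,
       SUBSTR(big_col, 1, 64000)  -- note: SUBSTR counts characters, not bytes,
                                  -- so multi-byte data may need a smaller limit
FROM dnt_data.table1
WHERE some_filter_col = 'some_value';"

# Export the staged files instead of reading the HCatalog table directly.
sqoop export \
  -D sqoop.connection.factories=com.cloudera.sqoop.manager.TeradataManagerFactory \
  --connect 'jdbc:servername/database=db_name' \
  --username **** -P \
  --table table1 \
  --export-dir /tmp/table1_export \
  --fields-terminated-by '\001' \
  -m 1
```

If truncation is not acceptable, the Teradata side likely needs a CLOB column rather than VARCHAR, since error 1186 reflects Teradata's 64000-byte ceiling on VARCHAR parameter values.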
Appreciate your help.
Labels:
- Apache Hive
- Apache Sqoop