Performing a Sqoop import with '--hive-import' into a Hive column of type char() or varchar() fails with:

17/06/05 14:04:44 INFO mapreduce.Job: Task Id : attempt_1496415095220_0016_m_000002_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

Example reproducing the failure. Create the Teradata and Hive tables as follows:

Teradata> create table td_importme_into_hive (col1 int not null primary key, col2 char(30));

Hive> create table td_import (col1 int, col2 char(30));

Execute the import:

sqoop import  --connection-manager {connection info} \
              --table td_importme_into_hive  --hive-import --hive-table td_import \
              -m 1 --split-by col1

This fails because char and varchar are not supported Hive data types for a Sqoop import from Teradata.

Workaround: create the Hive table with data type string instead of char() or varchar(), as shown below.
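
For example (a minimal sketch reusing the table and column names from above, with the same connection placeholder), recreate the Hive table with string and re-run the same import command unchanged:

Hive> drop table td_import;
Hive> create table td_import (col1 int, col2 string);

sqoop import  --connection-manager {connection info} \
              --table td_importme_into_hive  --hive-import --hive-table td_import \
              -m 1 --split-by col1

If char or varchar semantics are still needed downstream, the string column can be cast (for example, cast(col2 as char(30))) in a later Hive query or view.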

Comments

Does this issue still exist?