Performing a Sqoop import with --hive-import into a Hive table that uses a char() or varchar() column fails with:

17/06/05 14:04:44 INFO mapreduce.Job: Task Id : attempt_1496415095220_0016_m_000002_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

Working example. Create the Teradata and Hive tables as follows:

Teradata> create table td_importme_into_hive (col1 int not null primary key,  col2 char(30));

Hive> create table td_import (col1 int, col2 char(30));

Execute

sqoop import  --connection-manager {connection info} \
              --table td_importme_into_hive  --hive-import --hive-table td_import \
              -m 1 --split-by col1

This will fail because char and varchar are not supported Hive data types for a Sqoop import from Teradata.

Resolution: create the Hive table with the data type string instead of char() or varchar().
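As a sketch of the fix, reusing the table and column names from the example above (the {connection info} placeholder is kept as-is and must be filled in for your environment):

```shell
# Recreate the Hive target table with string in place of char(30)
hive -e "drop table if exists td_import;
         create table td_import (col1 int, col2 string);"

# Re-run the same import; it now succeeds because the Hive side uses string
sqoop import  --connection-manager {connection info} \
              --table td_importme_into_hive  --hive-import --hive-table td_import \
              -m 1 --split-by col1
```

Alternatively, Sqoop's --map-column-hive option (for example, --map-column-hive col2=string) can override the Hive-side type mapping for individual columns without changing the DDL; check the Sqoop user guide for your version before relying on it.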

Comments

Does this issue still exist?

Version history: Revision 1 of 1. Last updated 06-11-2017 09:55 AM.