
Performing a Sqoop import using 'hive-import' into a Hive column of data type char() or varchar() fails with:

17/06/05 14:04:44 INFO mapreduce.Job: Task Id : attempt_1496415095220_0016_m_000002_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException

Working example. Create the Teradata and Hive tables as follows:

Teradata> create table td_importme_into_hive (col1 int not null primary key,  col2 char(30));

Hive> create table td_import (col1 int, col2 char(30));


sqoop import  --connection-manager {connection info} \
              --table td_importme_into_hive  --hive-import --hive-table td_import \
              -m 1 --split-by col1

This import will fail because char and varchar are not supported Hive data types for a Sqoop import from Teradata.

Workaround: create the Hive table with data type string instead of char() or varchar().
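The workaround can be sketched as follows, reusing the table names from the example above. This is a sketch, not a verified run: the {connection info} placeholder stands for your site-specific connection manager and JDBC settings, exactly as in the original command.

```shell
# Recreate the Hive target table with string instead of char(30).
# (DROP TABLE IF EXISTS is standard Hive DDL; adjust the database as needed.)
hive -e "DROP TABLE IF EXISTS td_import;
         CREATE TABLE td_import (col1 INT, col2 STRING);"

# Re-run the same Sqoop import unchanged; with a string column the
# hive-import no longer hits the char()/varchar() failure.
sqoop import --connection-manager {connection info} \
             --table td_importme_into_hive \
             --hive-import --hive-table td_import \
             -m 1 --split-by col1
```

Note that only the Hive-side DDL changes; the Teradata source table can keep its char(30) column, since Sqoop maps it into the Hive string column on import.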

Last update: 06-11-2017 09:55 AM