ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException

Explorer

Hello:


I'm using HDP 2.4 (CentOS 6.7).

I'm using the Hortonworks Teradata Connector, which I've installed in /usr/hdp/2.4.0.0-169/hadoop-hdfs.

I believe the connector is installed correctly, but I get the following error when running my Sqoop script:

Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:195)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.<init>(ConnectorOutputFormat.java:91)
    at com.teradata.connector.common.ConnectorOutputFormat.getRecordWriter(ConnectorOutputFormat.java:38)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:647)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
17/04/24 02:26:42 INFO mapreduce.Job: Task Id : attempt_1492981007746_0002_m_000000_1, Status : FAILED
Error: org.apache.hadoop.fs.FileAlreadyExistsException: /user/cdwadmin/temp_022543/part-m-00000 for client 192.168.164.137 already exists
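The ClassNotFoundException is the root failure: org.apache.hadoop.hive.serde2.SerDeException ships in the Hive libraries (hive-serde / hive-exec), which are evidently not on the map task's classpath when the connector initializes its record writer. The FileAlreadyExistsException on the retried attempt is just a side effect: the first failed attempt already wrote part-m-00000. If you re-run the job, you may first need to clear the leftover output; a minimal sketch, using the paths from the error and script here (yours may differ):

# Remove the partial output left behind by the failed attempt before re-running.
# Both paths are taken from the error message and the sqoop command below; adjust for your run.
hdfs dfs -rm -r -skipTrash /user/cdwadmin/temp_022543
hdfs dfs -rm -r -skipTrash /user/cdwadmin/Addresses7438636T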

Here is my script:

sqoop import \
  --connect "jdbc:teradata://33.33.333.33/DATABASE=SQL_CLASS" \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username <user> \
  --password <password> \
  --query 'SELECT Subscriber_No ,Street ,City ,"State" ,Zip ,AreaCode ,Phone FROM SQL_CLASS.Addresses WHERE $CONDITIONS' \
  -m 4 \
  --hive-import \
  --hive-table sql_class.Addresses7 \
  --split-by Subscriber_No \
  --target-dir /user/cdwadmin/Addresses7438636T
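Since the missing class ships with Hive rather than HDFS, one common workaround is to put the Hive library jars on the Hadoop classpath before invoking sqoop, rather than copying jars into hadoop-hdfs. A minimal sketch, assuming the standard HDP 2.4 layout under /usr/hdp (the exact path on your cluster may differ):

# Make the Hive jars (including the one containing SerDeException) visible to the Sqoop job.
# Assumes the default HDP 2.4 install location; adjust the version directory to match yours.
export HIVE_HOME=/usr/hdp/2.4.0.0-169/hive
export HADOOP_CLASSPATH=$HIVE_HOME/lib/*:$HADOOP_CLASSPATH

The accepted solution below takes the same approach against a /usr/local/hive install.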
1 ACCEPTED SOLUTION

avatar
New Contributor

@Todd Wilson

I got the same error, but after following these steps I was able to import the data successfully.

Step 1: export the classpath and -libjars variables

# Put the Hive libraries on the Hadoop classpath so the map tasks can load the Hive SerDe classes.
export HADOOP_CLASSPATH=/usr/local/hive/lib/*
# Jars shipped to the job via -libjars: the Teradata JDBC driver plus the Hive and DataNucleus dependencies.
export LIB_JARS=/usr/lib/tdch/1.4/lib/terajdbc4.jar,/usr/local/hive/lib/hive-metastore-0.13.1-SNAPSHOT.jar,/usr/local/hive/lib/hive-exec-0.13.1-SNAPSHOT.jar,/usr/local/hive/lib/hive-cli-0.13.1-SNAPSHOT.jar,/usr/local/hive/lib/jdo-api-3.0.1.jar,/usr/local/hive/lib/libfb303-0.9.0.jar,/usr/local/hive/lib/libthrift-0.9.0.jar,/usr/local/hive/lib/antlr-runtime-3.4.jar,/usr/local/hive/lib/datanucleus-core-3.2.10.jar,/usr/local/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/local/hive/lib/datanucleus-rdbms-3.2.9.jar

Step 2: create the target ORC table in Hive

CREATE TABLE `test` (
  `colum1` int,
  `colum2` int,
  `colum3` string
)
STORED AS ORC;
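If you are scripting all three steps together, the same DDL can also be run non-interactively from the shell (a sketch; assumes the hive CLI is on the PATH):

hive -e "CREATE TABLE \`test\` (\`colum1\` int, \`colum2\` int, \`colum3\` string) STORED AS ORC;"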

Step 3: run the TDCH import

hadoop jar /usr/lib/tdch/1.4/lib/teradata-connector-1.4.4.jar \
com.teradata.connector.common.tool.ConnectorImportTool \
-libjars $LIB_JARS \
-url jdbc:teradata://<DB_NAME>/database=SCHEMA_NAME \
-username <username> \
-password <password> \
-jobtype hive -hiveconf 'file:///etc/hive/conf/hive-site.xml' \
-fileformat orcfile \
-sourcetable test \
-targetdatabase default \
-targettable test \
-nummappers 16 \
-method split.by.hash \
-splitbycolumn colum1
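Once the import completes, a quick sanity check that rows actually landed in the ORC table (a hypothetical verification step, not part of the original answer):

# Count the imported rows in the target table.
hive -e "SELECT COUNT(*) FROM default.test;"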


3 REPLIES


Explorer

Thank you @Mugdha and @Md Ali.