Member since: 04-24-2017
Posts: 4
Kudos Received: 0
Solutions: 0
11-22-2017 05:56 PM
Thank you @Mugdha and @Md Ali.
04-24-2017 01:24 PM
Hello (continuing from my previous thread): I'm using HDP 2.4 (CentOS 6.7) with the Hortonworks Teradata Connector, which I've installed in /usr/hdp/2.4.0.0-169/hadoop-hdfs. I believe the connector is installed correctly, but I get the following error when running my Sqoop script:

Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:195)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.<init>(ConnectorOutputFormat.java:91)
at com.teradata.connector.common.ConnectorOutputFormat.getRecordWriter(ConnectorOutputFormat.java:38)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:647)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
17/04/24 02:26:42 INFO mapreduce.Job: Task Id : attempt_1492981007746_0002_m_000000_1, Status : FAILED
Error: org.apache.hadoop.fs.FileAlreadyExistsException: /user/cdwadmin/temp_022543/part-m-00000 for client 192.168.164.137 already exists

Here is my script:

sqoop import --connect "jdbc:teradata://33.33.333.33/DATABASE=SQL_CLASS" --connection-manager org.apache.sqoop.teradata.TeradataConnManager --username <user> --password <password> --query 'SELECT Subscriber_No ,Street ,City ,"State" ,Zip ,AreaCode ,Phone FROM SQL_CLASS.Addresses WHERE $CONDITIONS' -m 4 --hive-import --hive-table sql_class.Addresses7 --split-by Subscriber_No --target-dir /user/cdwadmin/Addresses7438636T
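In case it helps anyone hitting the same SerDeException, a possible workaround (my assumption; the jar names and paths are what I'd expect on HDP 2.4 and may differ on other clusters) is to put the Hive serde classes on the classpath of the Sqoop client and its map tasks before re-running the script above:

# Assumption: hive-serde.jar and hive-exec.jar under the Hive client lib dir provide org.apache.hadoop.hive.serde2.SerDeException
export HIVE_HOME=/usr/hdp/current/hive-client
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/hive-serde.jar:$HIVE_HOME/lib/hive-exec.jar

# The generic -libjars option (placed right after "import") ships the same jars to the map tasks
sqoop import -libjars $HIVE_HOME/lib/hive-serde.jar,$HIVE_HOME/lib/hive-exec.jar ... (remaining options as in the script above)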
Labels:
- Apache Sqoop
04-24-2017 01:21 PM
This error no longer appears after running:

hadoop fs -mkdir -p /user/cdwadmin

However, I'm now getting the ClassNotFoundException below:

17/04/24 02:26:36 INFO mapreduce.Job: Task Id : attempt_1492981007746_0002_m_000001_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException
.............
Error: org.apache.hadoop.fs.FileAlreadyExistsException: /user/cdwadmin/temp_022543/part-m-00000 for client 192.168.164.137 already exists

I'll start another thread for this.
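The FileAlreadyExistsException looks like leftover output from the earlier failed attempt. My guess (the paths come from the error message and from my own --target-dir, so adjust as needed) is that removing the stale directories before re-running will clear that part:

# Remove output left behind by the failed map attempt before re-running the import
hadoop fs -rm -r /user/cdwadmin/temp_022543
# If the target directory was partially created, it may need to go too
hadoop fs -rm -r /user/cdwadmin/Addresses7438636T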
04-24-2017 01:21 PM
Hello:
I'm trying to move data from Teradata to Hadoop using the Hortonworks Teradata Connector. I believe the connector is installed properly under /usr/hdp/2.4.0.0-169/sqoop/lib.
I created a new user (cdwadmin) and gave them root privileges.
The error I am receiving when running my sqoop script is "Hive current user directory not exists".
17/04/24 01:47:09 ERROR teradata.TeradataSqoopImportHelper: Exception running Teradata import job
com.teradata.connector.common.exception.ConnectorException: Hive current user directory not exists
at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:142)
at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:58)
at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:374)
at org.apache.sqoop.teradata.TeradataConnManager.importQuery(TeradataConnManager.java:532)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:499)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
17/04/24 01:47:09 INFO teradata.TeradataSqoopImportHelper: Teradata import job completed with exit code 1

Here is my script:

sqoop import --connect "jdbc:teradata://33.33.333.33/DATABASE=SQL_CLASS" --connection-manager org.apache.sqoop.teradata.TeradataConnManager --username <user> --password <password> --query 'SELECT Subscriber_No ,Street ,City ,"State" ,Zip ,AreaCode ,Phone FROM SQL_CLASS.Addresses WHERE $CONDITIONS' -m 4 --hive-import --hive-table sql_class.Addresses7 --split-by Subscriber_No --target-dir /user/cdwadmin/Addresses7438636T

I'm using HDP 2.4 (CentOS 6.7). My understanding is that I need to create a directory for this user, but I don't know where or how to do that. Any help would be appreciated. Thank you.
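My current best guess at the fix (hedged: I'm assuming the connector expects an HDFS home directory at /user/<username>, and the superuser and group names may differ on your cluster) is to create a home directory for cdwadmin and hand ownership to that user:

# Run as the HDFS superuser (usually 'hdfs') to create the user's HDFS home directory
sudo -u hdfs hadoop fs -mkdir -p /user/cdwadmin
# Assumption: ownership should go to the user running the Sqoop job; 'hadoop' as the group is a guess
sudo -u hdfs hadoop fs -chown -R cdwadmin:hadoop /user/cdwadmin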
Tags:
- Hadoop Core
- hdp2.4
- Hive
- Sqoop
- teradata