Sqoop Import to Hive failed with "java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf"

Contributor

Hi guys,

I need some help with a Sqoop import to Hive using the --hive-import parameter. It fails with the error "Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf".

I already tried running this command, but I still got the same error:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hive/lib/*

And I can't find a /usr/lib/hive directory on my node.
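
Since this is a parcel-based CDH install (the log below references the CDH-5.13.0 parcel under /opt/cloudera/parcels), I suspect the Hive libraries live under the parcel directory rather than /usr/lib/hive. Something like the following is what I'd expect to need, although these paths are my assumption and may differ on your nodes:

# Parcel-based CDH keeps Hive under the active parcel, not /usr/lib/hive
# (paths below are assumptions based on the parcel name in the log)
export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*
# The error later in the log also says to check HIVE_CONF_DIR;
# /etc/hive/conf is the usual default location
export HIVE_CONF_DIR=/etc/hive/conf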

These are the details of the sqoop command that I ran:

[eduard@msimaster1 ~]$ sqoop-import --connect jdbc:mysql://10.87.87.201:3306/msi --username root --password P@ssw0rd --table tbl_project_activity --fields-terminated-by "~" --hive-import --hive-database msi_eduard --hive-table test
Warning: /opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/01/11 09:58:28 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
18/01/11 09:58:28 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/01/11 09:58:28 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/01/11 09:58:28 INFO tool.CodeGenTool: Beginning code generation
18/01/11 09:58:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tbl_project_activity` AS t LIMIT 1
18/01/11 09:58:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tbl_project_activity` AS t LIMIT 1
18/01/11 09:58:28 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-eduard/compile/c5dfb85158f3a31c86fe9c5108bbb0ce/tbl_project_activity.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/01/11 09:58:30 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-eduard/compile/c5dfb85158f3a31c86fe9c5108bbb0ce/tbl_project_activity.jar
18/01/11 09:58:30 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/01/11 09:58:30 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/01/11 09:58:30 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/01/11 09:58:30 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/01/11 09:58:30 INFO mapreduce.ImportJobBase: Beginning import of tbl_project_activity
18/01/11 09:58:30 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/01/11 09:58:31 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/01/11 09:58:34 INFO db.DBInputFormat: Using read commited transaction isolation
18/01/11 09:58:34 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `tbl_project_activity`
18/01/11 09:58:34 WARN db.TextSplitter: Generating splits for a textual index column.
18/01/11 09:58:34 WARN db.TextSplitter: If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
18/01/11 09:58:34 WARN db.TextSplitter: You are strongly encouraged to choose an integral split column.
18/01/11 09:58:34 INFO mapreduce.JobSubmitter: number of splits:4
18/01/11 09:58:34 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1515041314803_0021
18/01/11 09:58:34 INFO impl.YarnClientImpl: Submitted application application_1515041314803_0021
18/01/11 09:58:34 INFO mapreduce.Job: The url to track the job: http://msimaster1:8088/proxy/application_1515041314803_0021/
18/01/11 09:58:34 INFO mapreduce.Job: Running job: job_1515041314803_0021
18/01/11 09:58:41 INFO mapreduce.Job: Job job_1515041314803_0021 running in uber mode : false
18/01/11 09:58:41 INFO mapreduce.Job:  map 0% reduce 0%
18/01/11 09:58:46 INFO mapreduce.Job:  map 25% reduce 0%
18/01/11 09:58:48 INFO mapreduce.Job:  map 100% reduce 0%
18/01/11 09:58:48 INFO mapreduce.Job: Job job_1515041314803_0021 completed successfully
18/01/11 09:58:48 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=714556
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=481
                HDFS: Number of bytes written=4489574
                HDFS: Number of read operations=16
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=8
        Job Counters
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=18130
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=18130
                Total vcore-milliseconds taken by all map tasks=18130
                Total megabyte-milliseconds taken by all map tasks=18565120
        Map-Reduce Framework
                Map input records=25341
                Map output records=25341
                Input split bytes=481
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=447
                CPU time spent (ms)=6400
                Physical memory (bytes) snapshot=847212544
                Virtual memory (bytes) snapshot=11180744704
                Total committed heap usage (bytes)=686292992
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=4489574
18/01/11 09:58:48 INFO mapreduce.ImportJobBase: Transferred 4.2816 MB in 17.0317 seconds (257.4235 KB/sec)
18/01/11 09:58:48 INFO mapreduce.ImportJobBase: Retrieved 25341 records.
18/01/11 09:58:48 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tbl_project_activity` AS t LIMIT 1
18/01/11 09:58:48 INFO hive.HiveImport: Loading uploaded data into Hive
18/01/11 09:58:48 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
18/01/11 09:58:48 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
        at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:530)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
        ... 12 more
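
Side note: the log above also warns that passing the password on the command line is insecure. Sqoop's -P flag prompts for it interactively, and --password-file reads it from a file instead; a rough sketch, where the HDFS path is just an assumed home directory for my user:

# Keep the password out of the command line and shell history
# (/user/eduard is an assumed HDFS home directory)
echo -n 'P@ssw0rd' > .mysql-pass
hdfs dfs -put .mysql-pass /user/eduard/.mysql-pass
hdfs dfs -chmod 400 /user/eduard/.mysql-pass
sqoop import --connect jdbc:mysql://10.87.87.201:3306/msi --username root --password-file /user/eduard/.mysql-pass --table tbl_project_activity ...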

1 ACCEPTED SOLUTION

Contributor

I solved this by updating the Hive Metastore after enabling High Availability for HDFS.
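
For anyone hitting the same thing: after enabling HDFS High Availability, the filesystem roots recorded in the Hive Metastore still point at the old single-NameNode URI. Cloudera Manager has an "Update Hive Metastore NameNodes" action for this, and the Hive metatool can do the equivalent from the shell. A rough sketch, where the nameservice name and old NameNode address are assumptions to replace with your own:

# List the filesystem roots currently recorded in the metastore
hive --service metatool -listFSRoot
# Rewrite the old NameNode URI to the new HA nameservice
# (hdfs://nameservice1 and msimaster1:8020 are assumed values)
hive --service metatool -updateLocation hdfs://nameservice1 hdfs://msimaster1:8020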

