
How to fix the "Make sure HIVE_CONF_DIR is set correctly" error from a Sqoop import?


Problem: Inside the Hortonworks sandbox, using the Workflow Manager UI within Ambari to run a Sqoop import into Hive from a MySQL database, the Sqoop job would get killed with the error "Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly."

(FYI: I found this error by running yarn logs -applicationId <application ID>; I got the application ID from the Oozie UI.)
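For anyone hitting the same error, the log-retrieval step above looks roughly like this; the application ID shown is a hypothetical placeholder, so copy the real one from the Oozie UI (or from yarn application -list):

```shell
# Hypothetical application ID -- substitute the one shown in the Oozie UI.
APP_ID="application_1499999999999_0001"

# Dump the aggregated YARN container logs and search for the HiveConf error.
yarn logs -applicationId "$APP_ID" | grep -i "HiveConf"
```

Note that log aggregation must be enabled on the cluster for yarn logs to return anything once the application has finished.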

At first I could not find a fix, even though online searches suggested this command would resolve it: export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:your_hive_lib_path

Ultimately this did not work for me.

I believe this is due to Oozie not being able to find the Hive libraries for some reason. The same Sqoop import command works just fine from the command line and also from a Zeppelin shell.
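For comparison, a command-line Sqoop import of the kind that worked for me looks roughly like this; every connection detail below (host, database, table, user) is a hypothetical placeholder, so substitute your own:

```shell
# Hypothetical MySQL connection details -- replace with your own.
sqoop import \
  --connect jdbc:mysql://sandbox.hortonworks.com:3306/mydb \
  --username myuser -P \
  --table customers \
  --hive-import \
  --hive-table default.customers \
  -m 1
```

Run from a shell, Sqoop picks up the Hive classpath itself, which is why this succeeds even when the same import fails when launched through Oozie.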

Resolution: The fix I found was within Workflow Manager. I selected the Sqoop job I was trying to run, went to Advanced Properties, and went to the Include File input. I selected Browse, navigated to /user/hive/.hivejars, and selected the hive-exec jar in there. Attaching the hive-exec jar to the Sqoop job allowed the workflow/import to execute successfully.
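Under the hood, Workflow Manager generates an Oozie workflow, and attaching a jar via the Include File input corresponds roughly to a <file> element inside the Sqoop action. A minimal sketch of what that generated workflow might look like (the jar version, command, and names are hypothetical):

```xml
<workflow-app name="sqoop-import-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="sqoop-import"/>
  <action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.4">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect jdbc:mysql://host/db --table customers --hive-import</command>
      <!-- The fix: ship the hive-exec jar with the action so HiveConf is
           on the launcher classpath (jar version here is hypothetical). -->
      <file>/user/hive/.hivejars/hive-exec-1.2.1000.jar</file>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Sqoop import failed</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

Files listed this way are copied into the action's working directory and added to its classpath, which is why attaching hive-exec makes HiveConf loadable.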

This is posted as a question, but the resolution is given. I wanted to post this in case someone else was having the same problem; hope this helps. -Doug

1 REPLY

@Douglas Stroer

You can post this as a How-to article by clicking on Create. That would give you more rep points as well.

Thanks