Created 01-03-2016 09:12 AM
Hi Experts,
I am trying to execute an Oozie job. Inside the Oozie job I have a Hive action which creates a table, calculates a SUM, and drops that table. When I execute this job it runs without error in YARN but throws an error in Oozie.
The exception is
Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [1]
I have verified:
1. hive-site.xml
2. All parameters and paths.
Everything looks correct, but I am not able to figure out the reason for the error.
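For reference, my Hive action and script look roughly like the sketch below; the action name, script name, paths, and table/column names here are placeholders rather than my actual values.

<!-- workflow.xml fragment (placeholder names and paths) -->
<action name="hive-sum">
    <hive xmlns="uri:oozie:hive-action:0.5">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <job-xml>${appPath}/hive-site.xml</job-xml>
        <script>${appPath}/sum.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>

-- sum.hql (placeholder table and column names)
CREATE TABLE tmp_totals AS SELECT SUM(amount) AS total FROM source_table;
DROP TABLE tmp_totals;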
Created 01-04-2016 11:31 AM
There are several possible reasons why a Hive action in Oozie might fail: missing jars (Oozie needs to start it with the share lib, and that has to be configured correctly), security (is Kerberos configured?), or simply bad SQL. You would normally find more information in the logs, either in the YARN logs of the Oozie launcher task or the Hive task, or in the Oozie logs. Hue lets you click through all of them quite conveniently.
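If you are working from the command line instead of Hue, something like the following usually gets you to the real stack trace; the job ID, application ID, and Oozie server URL below are placeholders for your own values.

# Oozie's own log for the workflow (placeholder job ID and server URL)
oozie job -oozie http://oozie-host:11000/oozie -log 0000001-160103000000000-oozie-oozi-W

# Aggregated YARN logs of the launcher / Hive job (placeholder application ID)
yarn logs -applicationId application_1451800000000_0001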
Created 01-07-2016 06:43 PM
This error is just a wrapper around the root cause. I have seen this question a couple of times, so I created an article that will help you get to the bottom of the issue. See the link below -
https://community.hortonworks.com/articles/9148/tr...
My guess is that it could be because of a missing JDBC jar file. It is better to place the JDBC jar in HDFS.
Here is a working example - https://github.com/sainib/hadoop-data-pipeline
Let us know if that does not help.
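If it does turn out to be the JDBC driver, a minimal way to make it available is to copy the jar into the Oozie share lib in HDFS and refresh the share lib. The jar name and the lib_ timestamp directory below are only examples; use the ones that match your cluster.

# Copy the JDBC driver into the Hive share lib directory (example jar name and path)
hdfs dfs -put mysql-connector-java-5.1.37.jar /user/oozie/share/lib/lib_20151027124452/hive/

# Tell Oozie to pick up the updated share lib (example server URL)
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate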
Created 02-02-2016 01:04 AM
I have successfully run a Hive action from an Oozie workflow. My simple Hive script does:
drop table test1;
create table test1 as select * from A_BASE;
Here are the steps:
1: Run su - oozie from an SSH window.
2: Run hdfs dfs -put /usr/hdp/2.3.2.0-2950/atlas/hook/hive/* /user/oozie/share/lib/lib_20151027124452/hive (assuming HDP 2.3.2 is used).
3: Create a workflow that contains a Hive action.
4: Add the property oozie.action.sharelib.for.hive = hive,hcatalog,sqoop in the Oozie parameters (see the sketch after these steps).
5: Create a Hive script like the one above and upload it via "Script name" on the Hive action edit page.
6: Save the workflow.
7: Run it.
8: It should complete successfully.
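For anyone submitting from the command line rather than the workflow editor, roughly the same configuration can be expressed in job.properties; the host names and application path below are placeholders.

# job.properties sketch (placeholder hosts and paths)
nameNode=hdfs://namenode-host:8020
jobTracker=resourcemanager-host:8050
oozie.use.system.libpath=true
oozie.action.sharelib.for.hive=hive,hcatalog,sqoop
oozie.wf.application.path=${nameNode}/user/oozie/workflows/hive-test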