Failing Oozie Launcher threw exception, hive-site.xml (Permission denied)

Rising Star

Oozie Hive action failed

Oozie Hive action configuration 
================================================================= 


Using action configuration file /hadoop/data01/hadoop/yarn/local/usercache/hadoopdev/appcache/application_1443111597609_2691/container_1443111597609_2691_01_000002/action.xml 
------------------------ 
Setting env property for mapreduce.job.credentials.binary to: /hadoop/data01/hadoop/yarn/local/usercache/hadoopdev/appcache/application_1443111597609_2691/container_1443111597609_2691_01_000002/container_tokens 
------------------------ 
------------------------ 
Setting env property for tez.credentials.path to: /hadoop/data01/hadoop/yarn/local/usercache/hadoopdev/appcache/application_1443111597609_2691/container_1443111597609_2691_01_000002/container_tokens 
------------------------ 


<<< Invocation of Main class completed <<< 


Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], main() threw exception, hive-site.xml (Permission denied) 
java.io.FileNotFoundException: hive-site.xml (Permission denied) 
at java.io.FileOutputStream.open(Native Method) 
at java.io.FileOutputStream.<init>(FileOutputStream.java:221) 
at java.io.FileOutputStream.<init>(FileOutputStream.java:110) 
at org.apache.oozie.action.hadoop.HiveMain.setUpHiveSite(HiveMain.java:166) 
at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:196) 
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38) 
at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:606) 
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225) 
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54) 
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430) 
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342) 
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168) 
at java.security.AccessController.doPrivileged(Native Method) 
at javax.security.auth.Subject.doAs(Subject.java:415) 
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594) 
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163) 


Oozie Launcher failed, finishing Hadoop job gracefully 

The Oozie action is configured like this:

<action name="HivePartitionAction">
    <hive xmlns="uri:oozie:hive-action:0.3">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <job-xml>script/hive-site.xml</job-xml>
        <job-xml>script/xasecure-audit.xml</job-xml>
        <job-xml>script/xasecure-hive-security.xml</job-xml>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
        </configuration>
        <script>script/addPartition.sql</script>
        <file>script/xasecure-audit.xml</file>
        <file>script/xasecure-hive-security.xml</file>
        <file>script/xasecure-policymgr-ssl.xml</file>
        <file>script/hive-site.xml</file>
    </hive>
    <ok to="end"/>
    <error to="kill"/>
</action>

-- The files exist in the script subdirectory of the workflow directory and have read/write permission for all users.

-- hive-site.xml under /etc/oozie/conf/action-conf/hive is owned by the oozie user with permission 644.

Can anyone advise? Thanks.



I think it should have worked. In any case, a workaround you could try is to rename hive-site.xml to oozie-hive-site.xml under the script directory and use that path in the workflow XML.
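For example, the two references to the file in the action from your question would change to something like this (the other entries stay as they are):

    <!-- illustrative only: renamed copy of hive-site.xml in the workflow's script directory -->
    <job-xml>script/oozie-hive-site.xml</job-xml>
    <file>script/oozie-hive-site.xml</file>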

Rising Star

Hi Deepesh, can you shed some light on this?

Does the workaround have to be exactly oozie-hive-site.xml, or is any other name OK?

I've tried renaming it to hive-config.xml, but no luck.


Any name other than hive-site.xml should have worked. I'll update here if I think of anything else. One question (maybe a stupid one): did you re-upload the files to HDFS after making the changes?

Expert Contributor

I faced a similar issue and asked the same question: http://community.hortonworks.com/questions/801/oozie-hive-action-hive-sitexml-permission-denied.html

I haven't received any response yet, but we can reach out to the engineering team.


One more thing you can try is to give the absolute path to hive-site.xml in the job-xml tag.
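For example, something like this (the path below is only a placeholder; use the actual HDFS location of your workflow's copy of the file):

    <!-- placeholder path; point this at the real HDFS location of hive-site.xml -->
    <job-xml>${nameNode}/user/hadoopdev/workflows/hive-partition/script/hive-site.xml</job-xml>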


AFAIK, the hive-site.xml file can be uploaded to HDFS and used with the same name. Here is a Falcon project I created that does exactly that.

https://github.com/sainib/hadoop-data-pipeline

The workflow defines the path to hive-site.xml using a parameter, like this:

https://github.com/sainib/hadoop-data-pipeline/blob/master/falcon/workflow/workflow.xml#L95

and the parameter is defined in the process entity file like this:

https://github.com/sainib/hadoop-data-pipeline/blob/master/falcon/process/processData.xml#L31

I am wondering whether Falcon is looking for hive-site.xml where it should be, or where you think it should be. Can you give it a try with the absolute path of the file on HDFS?
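Roughly, the pattern looks like the following (the parameter name and path here are illustrative, not copied from the repo). In the workflow:

    <!-- the value of the parameter is supplied by the Falcon process entity -->
    <job-xml>${hiveSitePath}</job-xml>

and in the Falcon process entity:

    <properties>
        <!-- illustrative value; point it at the actual HDFS copy of hive-site.xml -->
        <property name="hiveSitePath" value="/apps/pipeline/conf/hive-site.xml"/>
    </properties>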

Rising Star

I forgot to mention that Ranger (XA secure) 3.5 is installed in this cluster.

Master Mentor
@Vincent Jiang

Here's a working sample using hive-site.xml with a Pig script; it can just as well be Hive.
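The sample itself isn't reproduced here, but a minimal Pig action that references hive-site.xml would look roughly like this (action name, script name, and paths are placeholders):

    <action name="PigWithHiveSite">
        <pig>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- hive-site.xml uploaded alongside the workflow; path is a placeholder -->
            <job-xml>script/hive-site.xml</job-xml>
            <script>script/sample.pig</script>
        </pig>
        <ok to="end"/>
        <error to="kill"/>
    </action>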

Master Mentor

@Vincent Jiang, please accept the best answer or provide your own solution.