Support Questions


Oozie HiveAction throwing permission denied exception

Explorer

Hello, I have a file hive.hql that works fine when I run it from Hive over SSH, but when I run it through an Oozie Hive action it fails with the exception below. The user has access to this directory. The problem started after updating Oozie and HDP.

Environment:

  • HDP 2.2.8
  • Hive 0.14
  • Oozie 4.1.0
FAILED: HiveException java.security.AccessControlException: Permission denied: user=tech_dmon, access=WRITE, inode="/user/p/data":tech_p:bgd_p:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:185)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6886)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6868)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6793)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:9676)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:1642)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1433)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2127)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2123)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2121)
Intercepting System.exit(40000)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [40000]

hive.hql

set mapreduce.job.queuename=prod;
use ${MY_VAR1};
CREATE TABLE IF NOT EXISTS `${MY_VAR1}.d_sites` ...
INSERT OVERWRITE DIRECTORY '/user/tech_dmon/D/DB' select '${MY_VAR2}' as ...

What could be causing the problem?

Thank you!

1 ACCEPTED SOLUTION

Explorer

Problem solved by updating hive-site.xml.
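
In practice that means making sure the hive-site.xml referenced by the workflow's <job-xml> elements is the cluster's current one. A minimal sketch, assuming the standard HDP config location and the workflow directory from job.properties:

# Copy the cluster's current Hive client config into the workflow directory on HDFS
hdfs dfs -put -f /etc/hive/conf/hive-site.xml /user/tech_dmon/Dynsim/

With that file in place, the relative <job-xml>hive-site.xml</job-xml> in each action resolves to the refreshed copy in the workflow application directory.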


8 REPLIES

Master Guru

@Sergey Orlov - Could you please share your job.properties and workflow.xml? In the error log I can see the inode is ""; it should be something like "hdfs://<namenode-host>:8020/<hdfs-path>".

Master Mentor
@Sergey Orlov

FAILED:HiveException java.security.AccessControlException:Permission denied: user=ms, access=WRITE, inode="":ms:ms:drwxr-xr-x

User ms does not have permission to write. Check whether the home directory exists:

hdfs dfs -ls /user/ms

If it's not there, create it and set its ownership:

hdfs dfs -mkdir -p /user/ms

hdfs dfs -chown -R ms:hdfs /user/ms
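
The same kind of check can be run against the path from the error itself. A sketch, assuming the user and path from the question (write access requires being the owner, being in a group with write permission, or world write):

# Show ownership and mode of the failing directory (error reports tech_p:bgd_p, drwxr-xr-x)
hdfs dfs -ls -d /user/p/data

# List the groups the failing user belongs to
hdfs groups tech_dmon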

Explorer

I have rewritten the question. hdfs dfs -ls /user/ms (tech_dmon in my case) works. I can run the Hive script from Hive over SSH, but I can't run it from the Oozie Hive action.

Explorer

@Kuldeep Kulkarni - About the inode: I made a mistake in the question and have rewritten it. Here is workflow.xml (I have hidden some of the addresses):

<workflow-app name="D_Sim-wf" xmlns="uri:oozie:workflow:0.4">
    <credentials>
        <credential name='hive_auth' type='hcat'>
            <property>
                <name>hcat.metastore.uri</name>
                <value>thrift://Address</value>
            </property>
            <property>
                <name>hcat.metastore.principal</name>
                <value>hive/address</value>
            </property>
        </credential>
    </credentials>
    <start to="shelldsym"/>
    <action name="shelldsym">
        <shell xmlns="uri:oozie:shell-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <job-xml>hive-site.xml</job-xml>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <exec>HDynSim.sh</exec>
            <argument>${HiveDB}</argument>
            <file>/user/${us}/${working_dir}/HDynSim.sh#HDynSim.sh</file>
            <capture-output/>
        </shell>
        <ok to="hivedsim"/>
        <error to="fail"/>
    </action>
    <action name="hivedsim" cred="hive_auth">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <job-xml>hive-site.xml</job-xml>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <property>
                    <name>oozie.hive.defaults</name>
                    <value>hive-site.xml</value>
                </property>
            </configuration>
            <script>hive.hql</script>
            <param>MY_VAR1=${wf:actionData('shelldsym')['MY_VAR1']}</param>
            <param>MY_VAR2=${wf:actionData('shelldsym')['MY_VAR2']}</param>
        </hive>
        <ok to="javadynsim"/>
        <error to="fail"/>
    </action>
    <action name="javadynsim">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <job-xml>hive-site.xml</job-xml>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <main-class>DSim.DynSim</main-class>
            <arg>${MySQLDBname}</arg>
            <arg>${MySQLUserName}</arg>
            <arg>${working_dir}</arg>
            <arg>${us}</arg>
            <arg>${url}</arg>
            <arg>${SqlTableName}</arg>
            <file>lib/HadoopTest-1.0-SNAPSHOT.jar</file>
            <capture-output/>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Script failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name='end'/>
</workflow-app>
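
As an aside, a workflow definition like this can be sanity-checked against the Oozie schema before submission (assuming the standard Oozie CLI):

oozie validate workflow.xml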

And here is job.properties

nameNode=hdfs://nameservice1
propertyLoc=coord.properties
MySQLDBname=dynsim_mon
MySQLUserName=dynsim_mon
working_dir=Dynsim
us=tech_dmon
url=address
SqlTableName=D_SIM
HiveDB=d_mon
jobTracker=address
wf_application_path=${nameNode}/user/${us}/${working_dir}
oozie.coord.application.path=${wf_application_path}
oozie.use.system.libpath=true
queueName=prod 
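
For reference, a coordinator/workflow with these properties would typically be submitted via the standard Oozie CLI; a minimal sketch, where the Oozie server URL is a placeholder:

oozie job -oozie http://<oozie-host>:11000/oozie -config job.properties -run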

Master Guru

@Sergey Orlov - As per the error given in the question:

FAILED:HiveException java.security.AccessControlException:Permission denied: user=tech_dmon, access=WRITE, inode="/user/p/data":tech_p:bgd_p:drwxr-xr-x

Could you please confirm whether the inode name is /user/tech_p/data or /user/p/data?

In either case, you are running your Oozie coordinator/workflow as the tech_dmon user, and it is trying to write into another user's directory in HDFS instead of /user/tech_dmon/, hence the permission denied error.

Could you please check the application logs of this job and let me know the values of the following properties (one way to pull them is sketched after the list):

  1. user.name
  2. mapreduce.job.user.name
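
A sketch with placeholder ids, assuming the standard Oozie and YARN CLIs:

# Find the external (launcher) application id for the failed action
oozie job -info <oozie-job-id>

# Dump the launcher logs and grep for the effective-user properties
yarn logs -applicationId <launcher-application-id> | grep -E 'user.name|mapreduce.job.user.name'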

Explorer

@Kuldeep Kulkarni - The inode name is /user/p/data. The Hive action only reads from /user/p/data (an external Hive table); it does not need write access. So why does it work from Hive over SSH? Also, this action reads data via SELECT from several similar paths, but permission denied is thrown only for this one.

1. user.name: tech_dmon

2. mapreduce.job.user.name: tech_dmon

Explorer

Problem solved by updating hive-site.xml.


So the issue was that you didn't have the correct hive-site.xml path. Accepting this answer and closing the question.