Member since: 12-24-2015
Posts: 5
Kudos Received: 1
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3744 | 12-28-2015 08:10 AM
12-28-2015 08:10 AM
The problem was solved by updating hive.xml.
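For anyone hitting the same thing, a minimal sketch for double-checking which hive-site.xml the Oozie action actually ships is to look at the copy in the workflow directory (the path is derived from the job.properties further down in this thread; the property grepped for is only an example):
hdfs dfs -ls /user/tech_dmon/Dynsim/hive-site.xml
hdfs dfs -cat /user/tech_dmon/Dynsim/hive-site.xml | grep -A1 hive.metastore.uris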
12-28-2015 04:56 AM
@Kuldeep Kulkarni - the inode name is /user/p/data. The Hive action only reads from /user/p/data (an external Hive table); it does not need write access there. So why does it work when I run hive from SSH? The action also reads (via SELECT) from several similar paths, but Permission denied is thrown only for this one.
1. user.name: tech_dmon
2. mapreduce.job.user.name: tech_dmon
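For reference, two quick checks on that path (a sketch, assuming the path, owner and group shown in the error message; hdfs groups is run on a cluster node):
hdfs dfs -getfacl /user/p/data   # any extended ACL entries beyond the drwxr-xr-x mode
hdfs groups tech_dmon            # whether tech_dmon is in the bgd_p group at all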
12-25-2015 04:37 AM
I have rewritten the question. hdfs dfs -ls /user/ms (as tech_dmon) works. I can run the Hive script from hive over SSH, but I can't run it from the Oozie Hive action.
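One way to compare the SSH run with the Oozie run is to pull the logs of the failed Hive action's launcher job (a sketch; the Oozie server URL and the IDs are placeholders, not values from this thread):
oozie job -oozie http://<oozie-server>:11000/oozie -info <workflow-job-id>   # shows the failed action and its external (YARN) application id
oozie job -oozie http://<oozie-server>:11000/oozie -log <workflow-job-id>
yarn logs -applicationId <application-id>                                    # stdout/stderr of the Hive launcher, including the effective user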
12-25-2015 04:31 AM
@Kuldeep Kulkarni - about the inode: that was a mistake in the question, and I have rewritten it. Here is workflow.xml (some addresses are hidden):
<workflow-app name="D_Sim-wf" xmlns="uri:oozie:workflow:0.4" >
<credentials>
<credential name='hive_auth' type='hcat'>
<property>
<name>hcat.metastore.uri</name>
<value>thrift://Address</value>
</property>
<property>
<name>hcat.metastore.principal</name>
<value>hive/address</value>
</property>
</credential>
</credentials>
<start to="shelldsym"/>
<action name="shelldsym">
<shell xmlns="uri:oozie:shell-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>hive-site.xml</job-xml>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<exec>HDynSim.sh</exec>
<argument>${HiveDB}</argument>
<file>/user/${us}/${working_dir}/HDynSim.sh#HDynSim.sh</file>
<capture-output/>
</shell>
<ok to="hivedsim"/>
<error to="fail"/>
</action>
<action name="hivedsim" cred="hive_auth">
<hive xmlns="uri:oozie:hive-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>hive-site.xml</job-xml>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
<property>
<name>oozie.hive.defaults</name>
<value>hive-site.xml</value>
</property>
</configuration>
<script>hive.hql</script>
<param>MY_VAR1=${wf:actionData('shelldsym')['MY_VAR1']}</param>
<param>MY_VAR2=${wf:actionData('shelldsym')['MY_VAR2']}</param>
</hive>
<ok to="javadynsim"/>
<error to="fail"/>
</action>
<action name="javadynsim">
<java>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>hive-site.xml</job-xml>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<main-class>DSim.DynSim</main-class>
<arg>${MySQLDBname}</arg>
<arg>${MySQLUserName}</arg>
<arg>${working_dir}</arg>
<arg>${us}</arg>
<arg>${url}</arg>
<arg>${SqlTableName}</arg>
<file>lib/HadoopTest-1.0-SNAPSHOT.jar</file>
<capture-output/>
</java>
<ok to="end" />
<error to="fail" />
</action>
<kill name="fail">
<message>Script failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name='end' />
</workflow-app>
And here is job.properties:
nameNode=hdfs://nameservice1
propertyLoc=coord.properties
MySQLDBname=dynsim_mon
MySQLUserName=dynsim_mon
working_dir=Dynsim
us=tech_dmon
url=address
SqlTableName=D_SIM
HiveDB=d_mon
jobTracker=address
wf_application_path=${nameNode}/user/${us}/${working_dir}
oozie.coord.application.path=${wf_application_path}
oozie.use.system.libpath=true
queueName=prod
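For completeness, a workflow with these properties would typically be submitted and checked like this (a sketch; the Oozie server URL is a placeholder, and oozie.coord.application.path is kept exactly as in the properties above):
oozie job -oozie http://<oozie-server>:11000/oozie -config job.properties -run
oozie job -oozie http://<oozie-server>:11000/oozie -info <returned-job-id>   # see which action fails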
12-24-2015 10:50 AM
1 Kudo
Hello, I have a file hive.hql that works fine when I run it from hive over SSH. But when I run it from an Oozie action, it fails with the exception below. The user has access to this directory. The problem started after updating Oozie and HDP.
Environment:
HDP 2.2.8
Hive 0.14
Oozie 4.1.0
FAILED: HiveException java.security.AccessControlException: Permission denied: user=tech_dmon, access=WRITE, inode="/user/p/data":tech_p:bgd_p:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:185)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6886)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6868)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6793)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:9676)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:1642)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1433)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2127)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2123)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2121)
Intercepting System.exit(40000)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [40000]
hive.hql:
set mapreduce.job.queuename=prod;
use ${MY_VAR1};
CREATE TABLE IF NOT EXISTS `${MY_VAR1}.d_sites` ...
INSERT OVERWRITE DIRECTORY '/user/tech_dmon/D/DB' select '${MY_VAR2}' as ...
What could be causing the problem?
Thank you!
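As a sanity check, the SELECT source and the INSERT OVERWRITE DIRECTORY target in the script above have different permission needs (a sketch using the paths from the post):
hdfs dfs -ls -d /user/p/data            # path from the error: a plain SELECT should only need read/execute here
hdfs dfs -ls -d /user/tech_dmon/D/DB    # INSERT OVERWRITE DIRECTORY target: tech_dmon needs write here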
Labels:
- Apache Hadoop
- Apache Hive
- Apache Oozie