Avoid Oozie running Hive action with user credential
Labels: Apache Hive, Apache Oozie
Created ‎02-27-2017 01:34 PM
I want to run a very simple Oozie workflow with a Hive action on a kerberized cluster.
The problem is that the Hive action runs with my personal credentials instead of the hive service user, which is what happens when I run the same query through Hive View. If I grant my account access to "/apps/..." in Ranger, the Oozie workflow works fine, but we don't want personal accounts to have access to the "/apps/..." folder.
How can I run a Hive action from Oozie when my user does not have access to the "/apps/..." folder on HDFS?
== WORKFLOW.XML ==
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="oozie_hive_kerberos_test">
  <credentials>
    <credential name="hcat" type="hcat">
      <property>
        <name>hcat.metastore.principal</name>
        <value>hive/_HOST@<host>.com</value>
      </property>
      <property>
        <name>hcat.metastore.uri</name>
        <value>thrift://<host>.com:9083</value>
      </property>
    </credential>
  </credentials>
  <start to="hive"/>
  <action cred="hcat" name="hive">
    <hive xmlns="uri:oozie:hive-action:0.6">
      <job-tracker>${resourceManager}</job-tracker>
      <name-node>${nameNode}</name-node>
      <query>
        use XXXXX;
        drop table if exists YYYY.ZZZZ;
      </query>
    </hive>
    <ok to="end"/>
    <error to="kill"/>
  </action>
  <kill name="kill">
    <message>${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
== ERROR MESSAGE ==
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Logging initialized using configuration in /data/hadoop/yarn/local/usercache/MY_USER_NAME/appcache/application_1487006380071_0351/container_e94_1487006380071_0351_01_000002/hive-log4j.properties
FAILED: SemanticException MetaException(message:org.apache.hadoop.security.AccessControlException: Permission denied: user=MY_USER_NAME, access=EXECUTE, inode="/apps/hive/warehouse/DATABASE.db":hdfs:hdfs:d---------
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
    at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:307)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
Created ‎03-05-2017 01:36 AM
Your "credential" section looks wrong, it should be something like this:
<property>
  <name>hcat.metastore.uri</name>
  <value>thrift://<host>:<port></value>
</property>
<property>
  <name>hcat.metastore.principal</name>
  <value>hive/<host>@<realm></value>
</property>
On every node where the Oozie client is installed you can find good examples for all Oozie actions, including the Hive action, in "/usr/hdp/current/oozie-client/doc/examples". Check the file apps/hive/workflow.xml.security under "examples" and modify job.properties to provide your "realm" and other required parameters. Also, in the case of the hive2 action, be sure to test with the HiveServer2 server running in binary transport mode; there were some bugs in http mode on kerberized clusters. This applies only to the hive2 action; the hive action you are trying should work with both transport modes.
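For example, a minimal job.properties sketch for the workflow posted above; the hostnames, ports and application path are placeholders to replace with your cluster's values, and the nameNode/resourceManager property names must match the ${...} parameters referenced in the workflow:

# Hypothetical values - adjust hosts, ports and paths for your cluster
nameNode=hdfs://<namenode-host>:8020
resourceManager=<resourcemanager-host>:8050
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/<your-user>/oozie_hive_kerberos_test
# Submit (after kinit) with: oozie job -oozie http://<oozie-host>:11000/oozie -config job.properties -run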
Created ‎03-06-2017 12:52 PM
Thanks for your reply. I solved the problem by converting the Oozie workflow to use a Hive2 action, roughly along the lines of the sketch below.
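A rough sketch of what such a conversion might look like (not my exact workflow; <host>, <realm> and the HS2 port are placeholders, and it assumes an Oozie version that supports the hive2-action:0.2 schema with an inline <query> plus the "hive2" credential type):

<workflow-app xmlns="uri:oozie:workflow:0.5" name="oozie_hive2_kerberos_test">
  <credentials>
    <!-- "hive2" credential type: Oozie obtains a HiveServer2 delegation token from these properties -->
    <credential name="hs2-cred" type="hive2">
      <property>
        <name>hive2.jdbc.url</name>
        <value>jdbc:hive2://<host>:10000/default</value>
      </property>
      <property>
        <name>hive2.server.principal</name>
        <value>hive/<host>@<realm></value>
      </property>
    </credential>
  </credentials>
  <start to="hive2"/>
  <action name="hive2" cred="hs2-cred">
    <hive2 xmlns="uri:oozie:hive2-action:0.2">
      <job-tracker>${resourceManager}</job-tracker>
      <name-node>${nameNode}</name-node>
      <jdbc-url>jdbc:hive2://<host>:10000/default</jdbc-url>
      <!-- same query as the original hive action -->
      <query>
        use XXXXX;
        drop table if exists YYYY.ZZZZ;
      </query>
    </hive2>
    <ok to="end"/>
    <error to="kill"/>
  </action>
  <kill name="kill">
    <message>${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>

With a hive2 action the query is executed by HiveServer2, so as long as HiveServer2 runs with hive.server2.enable.doAs=false the HDFS access to /apps/hive/warehouse happens as the hive service user and only Ranger's Hive policies apply to the submitting user.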
Created ‎03-06-2017 12:52 PM
Problem solved by using Hive2.