Member since: 09-06-2016
Posts: 35
Kudos Received: 2
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2676 | 03-06-2017 12:52 PM
 | 2635 | 03-01-2017 09:46 PM
02-27-2017 01:34 PM

I want to run a very simple Oozie workflow with a Hive action on a Kerberized cluster. The problem is that Hive is using my credentials and not the hive user, as it does when going through Hive View.
If I change my access in Ranger for "/apps/..." then the Oozie workflow works fine.
But we don't want personal accounts to have access to the "/apps/..." folder.
How is it possible to run a Hive action when the user doesn't have access to the "/apps/..." folder on HDFS?

== WORKFLOW.XML ==

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="oozie_hive_kerberos_test">
<credentials>
<credential name="hcat" type="hcat">
<property>
<name>hcat.metastore.principal</name>
<value>hive/_HOST@<host>.com</value>
</property>
<property>
<name>hcat.metastore.uri</name>
<value>thrift://<host>.com:9083</value>
</property>
</credential>
</credentials>
<start to="hive"/>
<action cred="hcat" name="hive">
<hive xmlns="uri:oozie:hive-action:0.6">
<job-tracker>${resourceManager}</job-tracker>
<name-node>${nameNode}</name-node>
<query>
use XXXXX;
drop table if exists YYYY.ZZZZ;
</query>
</hive>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>${wf:errorMessage(wf:lastErrorNode())}</message>
</kill>
<end name="end"/>
</workflow-app>
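For reference, a minimal job.properties that could accompany this workflow might look like the one below. The host names are placeholders (assumptions, not taken from the actual cluster); oozie.use.system.libpath is needed so the Hive action can find the Oozie Hive sharelib.

== JOB.PROPERTIES (example) ==

# Placeholders - replace with your cluster's actual endpoints
nameNode=hdfs://<namenode-host>:8020
resourceManager=<resourcemanager-host>:8050
# Required so the Hive action picks up the Oozie Hive sharelib
oozie.use.system.libpath=true
# HDFS path to the directory containing workflow.xml
oozie.wf.application.path=${nameNode}/user/${user.name}/oozie_hive_kerberos_test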
== ERROR MESSAGE ==

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Logging initialized using configuration in /data/hadoop/yarn/local/usercache/MY_USER_NAME/appcache/application_1487006380071_0351/container_e94_1487006380071_0351_01_000002/hive-log4j.properties
FAILED: SemanticException MetaException(message:org.apache.hadoop.security.AccessControlException: Permission denied: user=MY_USER_NAME, access=EXECUTE, inode="/apps/hive/warehouse/DATABASE.db":hdfs:hdfs:d---------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:307)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
Labels:
- Apache Hive
- Apache Oozie
02-23-2017 02:09 PM

Thanks for your reply. I am using HDP 2.5 out of the box, which I guess means Ambari 2.4.2. I am having a hard time finding out what kind of upgrades are included in Ambari 2.5. Do you have a link, and an idea of when Ambari 2.5 is going to be released?
02-23-2017 01:13 PM

Thanks for some nice articles.
At the moment I am using the Oozie View from HDP 2.5. There are many things that don't work, which is very frustrating. Your screenshots indicate that you are using Hue or something else. Could you tell me which tool you are using? https://oozie.apache.org/docs/4.2.0/WorkflowFunctionalSpec.html
02-09-2017 04:34 PM

I am unable to connect with a JDBC driver from a Windows PC to Hive with Kerberos. Everything works fine with an ODBC connection, but that is not an option in this case. The connection string is:

jdbc:hive2://XXX.YYY.com:10000/default;principal=hive/XXX.YYY.com@YYYY.com;saslQop=auth-conf

And the error received in Hive's log is:

2017-02-09 16:12:21,254 ERROR [HiveServer2-Handler-Pool: Thread-151963]: server.TThreadPoolServer (TThreadPoolServer.java:run(297)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:609)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:606)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1704)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:606)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
... 10 more

I guess this has something to do with a Kerberos ticket that is not received by Hive. Link to the JDBC driver jar: https://github.com/timveil/hive-jdbc-uber-jar
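For context, here is a minimal Java sketch of how the connection is being attempted. The class name and krb5.conf path are illustrative assumptions; it presumes a valid Kerberos ticket for the user on the Windows machine and the uber jar on the classpath. If the JVM cannot see a Kerberos ticket, the server logs exactly this kind of "GSS initiate failed" error.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveKerberosJdbcTest {
    public static void main(String[] args) throws Exception {
        // Point the JVM at the Kerberos config if it is not in the
        // default location (this Windows path is an assumption).
        System.setProperty("java.security.krb5.conf", "C:\\kerberos\\krb5.conf");

        // Same connection string as above; requires a valid TGT on the client.
        String url = "jdbc:hive2://XXX.YYY.com:10000/default;"
                + "principal=hive/XXX.YYY.com@YYYY.com;saslQop=auth-conf";

        // Run a trivial query to verify the SASL/Kerberos handshake succeeds.
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("show databases")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}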
Labels:
- Apache Hive
10-24-2016 01:13 PM

Okay. I was hoping this feature could be, or will be, available in Resource-Based policies. One use case could be data in HDFS that should only be accessible based on location or a time period.
10-24-2016 09:49 AM

Thanks @Terry Stebbens, would this also enable the "Policy Conditions" option?
10-24-2016 08:39 AM

Hi,
When I log in to the Sandbox 2.5 (VMware), Ranger doesn't contain any option for "Deny" or "Policy Condition", only through "Tag Based..". In the documentation a screenshot and description are shown with Hive and a "Deny" condition. Link: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_security/content/about_ranger_policies.html

Questions:
1) Is there anything that needs to be enabled to get this to work?
2) Is "Policy Condition" possible in a Resource-Based Policy, or only in "Tag Based.."?

/ Anders
10-20-2016 03:30 PM

2 Kudos

I had a hard time finding a way to add a tag/trait in Atlas using the REST API.
Here is a solution:

POST http://{YOUR IP ADDRESS}:21000/api/atlas/entities/{GUID FOR ENTITY}/traits/

BODY:
{"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Struct","typeName":"PII","values":{}}

As a curl command:

curl -X POST -H "Content-Type: application/json" -H "Authorization: Basic YWRtaW46YWRtaW4=" -H "Cache-Control: no-cache" -d '{"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Struct","typeName":"PII","values":{}}' "http://192.168.255.128:21000/api/atlas/entities/d5dcb483-d2fc-4544-8368-6ef56321efdb/traits/"
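To verify that the trait was applied, you can fetch the entity definition back and look for "PII" in its traits. This is a quick sketch assuming the same sandbox endpoint and the default admin:admin credentials (which is what the Basic header YWRtaW46YWRtaW4= encodes):

curl -H "Authorization: Basic YWRtaW46YWRtaW4=" "http://192.168.255.128:21000/api/atlas/entities/d5dcb483-d2fc-4544-8368-6ef56321efdb"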
Labels:
- Apache Atlas
10-20-2016 01:37 PM

Thanks... This means that security based on tags of individual files or folders in HDFS can't be solved at the moment? Correct?