Member since
02-29-2016
Posts: 108
Kudos Received: 213
Solutions: 14
My Accepted Solutions
Views | Posted
---|---
2006 | 08-18-2017 02:09 PM
3507 | 06-16-2017 08:04 PM
3248 | 01-20-2017 03:36 AM
8807 | 01-04-2017 03:06 AM
4391 | 12-09-2016 08:27 PM
02-03-2017
05:32 PM
3 Kudos
I have a test environment with OpenLDAP and an MIT KDC as the backend directory services. I am using it to test NiFi authorization through Ranger and am running into an issue where the user name does not seem to match correctly. Here is my setup:

- HDF 2.1.1.0, NiFi 1.1.0, and Ranger 0.6.2
- Cluster installed with all HDF components except Storm and Kafka
- Cluster Kerberized with the MIT KDC
- Credentials in OpenLDAP
- Ranger synced with OpenLDAP
- Ranger NiFi policy created for a user with all permissions

I can get to the NiFi login page and log in with the credentials from OpenLDAP, but NiFi then complains about not having enough access. Looking at the audit log, the user name logged in Ranger is hadoopadmin@FIELD.HORTONWORKS.COM rather than hadoopadmin, so it seems the KDC principal name is being used here. I haven't set up identity mapping and those values are empty now. What values should I use to get the user name mapped correctly? Thanks,
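For what it's worth, NiFi has identity-mapping properties in nifi.properties (also exposed through Ambari) that can strip the realm from a Kerberos principal before authorization. A sketch along those lines; the `.kerb` suffix is just an arbitrary mapping name, not something mandated:

```properties
# Map a Kerberos principal like hadoopadmin@FIELD.HORTONWORKS.COM
# down to the short name hadoopadmin ($1 is the first capture group).
nifi.security.identity.mapping.pattern.kerb=^(.*?)@(.*?)$
nifi.security.identity.mapping.value.kerb=$1
```

With a mapping like this in place, Ranger policies and audit entries should see the short user name rather than the full principal.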
Labels:
- Apache NiFi
- Apache Ranger
01-20-2017
03:33 AM
@Sergey Soldatov Adding it in Ambari at the end of "Advanced zeppelin-env" -> "zeppelin_env_content" worked perfectly.
01-19-2017
03:06 AM
2 Kudos
@Sergey Soldatov It works after following your steps. It would be nice to figure out how to do step 3 in Ambari if possible; I always worry that a later Ambari update would wipe the change out.
Thanks a lot for your help!
01-18-2017
11:09 PM
2 Kudos
@dvillarreal I read your comments on https://community.hortonworks.com/articles/38348/ranger-is-not-allowing-access-to-knox-resources-wh.html If you look at my topology, it contains the group section just like in your post. The only difference I can think of is that I use OpenLDAP as the directory server rather than AD, so the values for the object class and attributes differ from AD's.
<param>
<name>main.ldapRealm.authorizationEnabled</name>
<value>true</value>
</param>
<param>
<name>main.ldapRealm.groupSearchBase</name>
<value>ou=Groups,dc=field,dc=hortonworks,dc=com</value>
</param>
<param>
<name>main.ldapRealm.groupObjectClass</name>
<value>posixgroup</value>
</param>
<param>
<name>main.ldapRealm.groupIdAttribute</name>
<value>cn</value>
</param>
01-18-2017
10:59 PM
@Sergey Soldatov Are you running into the same problem on a secured cluster or a non-secured one?
01-18-2017
10:58 PM
2 Kudos
@Josh Elser HBase is up and running fine. I can create tables in both the hbase shell and sqlline.py:
hbase(main):001:0> create 'my_table1', {NAME =>'cf1'}, {NAME =>'cf2'}
0 row(s) in 2.9560 seconds
=> Hbase::Table - my_table1
hbase(main):002:0> put 'my_table1', 'rowkey01', 'cf1:c1', 'test value'
0 row(s) in 0.6850 seconds
hbase(main):003:0> get 'my_table1', 'rowkey01'
COLUMN CELL
cf1:c1 timestamp=1484780190152, value=test value
1 row(s) in 0.0780 seconds
01-18-2017
08:28 PM
@Josh Elser HBase is running fine. I could pull the table list and look into the "atlas_titan" table's content:
[root@qwang-hdp5 logs]# hbase shell
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.1.2.2.5.3.0-37, rcb8c969d1089f1a34e9df11b6eeb96e69bcf878d, Tue Nov 29 18:48:22 UTC 2016
hbase(main):001:0> list
TABLE
ATLAS_ENTITY_AUDIT_EVENTS
SYSTEM.CATALOG
SYSTEM.FUNCTION
SYSTEM.SEQUENCE
SYSTEM.STATS
atlas_titan
driver_dangerous_event
my_table
8 row(s) in 0.5710 seconds
hbase(main):003:0> scan 'atlas_titan'
01-18-2017
08:23 PM
@lmccay As you mentioned, the log does indicate the group search is not returning the right groups:
17/01/18 15:31:26 ||5725e8ba-938d-40a7-86b9-64642ad8903f|audit|WEBHDFS|hr1|||authentication|uri|/gateway/default/webhdfs/v1/hr/exempt?op=LISTSTATUS|success|
17/01/18 15:31:26 ||5725e8ba-938d-40a7-86b9-64642ad8903f|audit|WEBHDFS|hr1|||authentication|uri|/gateway/default/webhdfs/v1/hr/exempt?op=LISTSTATUS|success|Groups: []
How do I configure Knox to do the group lookup? I don't see anything in my topology related to group lookup:
<topology>
<gateway>
<provider>
<role>authentication</role>
<name>ShiroProvider</name>
<enabled>true</enabled>
<param>
<name>sessionTimeout</name>
<value>30</value>
</param>
<param>
<name>main.ldapRealm</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
</param>
<!-- changes for AD/user sync -->
<param>
<name>main.ldapContextFactory</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapContextFactory</value>
</param>
<!-- main.ldapRealm.contextFactory needs to be placed before other main.ldapRealm.contextFactory* entries -->
<param>
<name>main.ldapRealm.contextFactory</name>
<value>$ldapContextFactory</value>
</param>
<!-- AD url -->
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldap://qwang-kdc-ldap.field.hortonworks.com:389</value>
</param>
<!-- system user -->
<param>
<name>main.ldapRealm.contextFactory.systemUsername</name>
<value>cn=admin,dc=field,dc=hortonworks,dc=com</value>
</param>
<!-- pass in the password using the alias created earlier -->
<param>
<name>main.ldapRealm.contextFactory.systemPassword</name>
<value>password</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
<value>simple</value>
</param>
<param>
<name>urls./**</name>
<value>authcBasic</value>
</param>
<!-- AD groups of users to allow -->
<param>
<name>main.ldapRealm.searchBase</name>
<value>ou=Users,dc=field,dc=hortonworks,dc=com</value>
</param>
<param>
<name>main.ldapRealm.userObjectClass</name>
<value>person</value>
</param>
<param>
<name>main.ldapRealm.userSearchAttributeName</name>
<value>uid</value>
</param>
<!-- changes needed for group sync-->
<param>
<name>main.ldapRealm.authorizationEnabled</name>
<value>true</value>
</param>
<param>
<name>main.ldapRealm.groupSearchBase</name>
<value>ou=Groups,dc=field,dc=hortonworks,dc=com</value>
</param>
<param>
<name>main.ldapRealm.groupObjectClass</name>
<value>posixgroup</value>
</param>
<param>
<name>main.ldapRealm.groupIdAttribute</name>
<value>cn</value>
</param>
</provider>
<provider>
<role>identity-assertion</role>
<name>Default</name>
<enabled>true</enabled>
</provider>
<provider>
<role>authorization</role>
<name>XASecurePDPKnox</name>
<enabled>true</enabled>
</provider>
</gateway>
<service>
<role>NAMENODE</role>
<url>hdfs://{{namenode_host}}:{{namenode_rpc_port}}</url>
</service>
<service>
<role>JOBTRACKER</role>
<url>rpc://{{rm_host}}:{{jt_rpc_port}}</url>
</service>
<service>
<role>WEBHDFS</role>
<url>http://{{namenode_host}}:{{namenode_http_port}}/webhdfs</url>
</service>
<service>
<role>WEBHCAT</role>
<url>http://{{webhcat_server_host}}:{{templeton_port}}/templeton</url>
</service>
<service>
<role>OOZIE</role>
<url>http://{{oozie_server_host}}:{{oozie_server_port}}/oozie</url>
</service>
<service>
<role>WEBHBASE</role>
<url>http://{{hbase_master_host}}:{{hbase_master_port}}</url>
</service>
<service>
<role>HIVE</role>
<url>http://{{hive_server_host}}:{{hive_http_port}}/{{hive_http_path}}</url>
</service>
<service>
<role>RESOURCEMANAGER</role>
<url>http://{{rm_host}}:{{rm_port}}/ws</url>
</service>
</topology>
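One subtlety in the topology above is the comment that `main.ldapRealm.contextFactory` must be placed before the other `main.ldapRealm.contextFactory.*` entries. A hypothetical sanity check (not part of Knox) that verifies this ordering in a topology fragment, as a sketch:

```python
# Check that main.ldapRealm.contextFactory appears before any
# main.ldapRealm.contextFactory.* param, as the Shiro-based
# provider configuration requires.
import xml.etree.ElementTree as ET

def context_factory_ordered(topology_xml: str) -> bool:
    root = ET.fromstring(topology_xml)
    # Collect <name> values of all <param> elements, in document order.
    names = [p.findtext("name") for p in root.iter("param")]
    base = "main.ldapRealm.contextFactory"
    try:
        base_idx = names.index(base)
    except ValueError:
        return False  # base entry missing entirely
    # Every contextFactory.* entry must come after the base entry.
    return all(i > base_idx
               for i, n in enumerate(names)
               if n and n.startswith(base + "."))

snippet = """<provider>
  <param><name>main.ldapRealm.contextFactory</name><value>$ldapContextFactory</value></param>
  <param><name>main.ldapRealm.contextFactory.url</name><value>ldap://host:389</value></param>
</provider>"""
print(context_factory_ordered(snippet))  # prints True
```

Running a check like this against an edited topology before deploying it can catch the mis-ordering that otherwise surfaces only as confusing LDAP bind failures.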
01-18-2017
03:58 PM
2 Kudos
HDP 2.5 secured cluster with Zeppelin and HBase installed. I can connect to Phoenix with the sqlline.py utility using the following parameters:
/usr/hdp/current/phoenix-client/bin/sqlline.py <zk1>,<zk2>,<zk3>:2181:/hbase-secure:hbase@DOMAIN.COM:/etc/security/keytabs/hbase.headless.keytab
However, when I try to use similar parameters for the JDBC connection string in the Zeppelin Phoenix interpreter, I get null:
org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=1, exceptions:
Wed Jan 18 15:13:11 UTC 2017, RpcRetryingCaller{globalStartTime=1484752390908, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Broken pipe
class org.apache.zeppelin.interpreter.InterpreterException
Do I need to create a phoenixuser principal and use that keytab in the JDBC interpreter settings instead? And what are phoenix.user and phoenix.password?
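For reference, the secure Phoenix JDBC URL follows the same colon-separated shape as the sqlline.py argument above, prefixed with jdbc:phoenix:. A small sketch assembling it; the host names, principal, and keytab path here are the placeholders from this post, not values from a real cluster:

```python
# Hypothetical helper: build a secure Phoenix JDBC URL from the same
# pieces passed to sqlline.py (quorum, port, znode, principal, keytab).
def phoenix_jdbc_url(zk_hosts, zk_port, znode, principal, keytab):
    return "jdbc:phoenix:{}:{}:{}:{}:{}".format(
        ",".join(zk_hosts), zk_port, znode, principal, keytab)

url = phoenix_jdbc_url(
    ["zk1", "zk2", "zk3"], 2181, "/hbase-secure",
    "hbase@DOMAIN.COM", "/etc/security/keytabs/hbase.headless.keytab")
print(url)
# jdbc:phoenix:zk1,zk2,zk3:2181:/hbase-secure:hbase@DOMAIN.COM:/etc/security/keytabs/hbase.headless.keytab
```

The principal named in the URL must be one the interpreter's JVM can actually use the keytab for, which is why a mismatch between the two often shows up as opaque RPC errors like the Broken pipe above.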
Labels:
- Apache Phoenix
- Apache Zeppelin