Member since: 09-11-2015
Posts: 41
Kudos Received: 48
Solutions: 14

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2177 | 02-03-2017 09:39 AM
 | 1916 | 01-31-2017 12:41 PM
 | 2936 | 01-20-2017 12:38 PM
 | 4239 | 01-18-2017 01:26 PM
 | 7407 | 01-11-2017 02:35 PM
11-09-2022
12:06 AM
The default page size seems to be 200 on most of the APIs. Use the query parameters pageSize and startIndex to page through the results.
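As a hedged sketch (the endpoint, credentials and the items field in the JSON response are placeholders, not tied to a specific product API), paging with these two parameters can look like:

pageSize=200
startIndex=0
while true; do
  # fetch one page of results
  page=$(curl -s -u admin:admin "https://server:port/api/v1/items?pageSize=${pageSize}&startIndex=${startIndex}")
  count=$(echo "$page" | jq '.items | length')
  [ "$count" -eq 0 ] && break            # no more results
  echo "$page" | jq -r '.items[].name'   # process this page
  startIndex=$((startIndex + pageSize))  # advance to the next page
done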
12-29-2021
01:10 AM
I believe that in Ranger 1.2.0 the property xasecure.audit.provider.summary.enabled is exposed as the "Audit provider summary enabled" checkbox under Advanced ranger-hdfs-audit in the HDFS service configuration in Ambari.
08-04-2021
06:31 AM
In my case I had to restart the HiveServer2 services on the nodes after I had joined the hosts to the domain (using the sssd service).
04-13-2017
12:13 PM
3 Kudos
When trying to add a policy that has many resource paths to Ranger using the API, the request can fail with the error: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Out of range value for column 'sort_order' at row 1
Error Code: 1264
Call: INSERT INTO x_policy_resource_map (ADDED_BY_ID, CREATE_TIME, sort_order, resource_id, UPDATE_TIME, UPD_BY_ID, value) VALUES (?, ?, ?, ?, ?, ?, ?)
bind => [7 parameters bound]
Query: InsertObjectQuery(XXPolicyResourceMap [XXDBBase={createTime={Thu Apr 13 11:42:38 UTC 2017} updateTime={Thu Apr 13 11:42:39 UTC 2017} addedByUserId={1} updatedByUserId={1} } id=null, resourceId=43, value=/tmp/129, order=128])

This is caused by a limit in Ranger: a single policy can contain a maximum of 128 resource paths. The workaround is to split the policy into two or more policies, each containing no more than 128 resource paths.
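As a hedged illustration of that workaround (the Ranger host, credentials, service name and policy item below are placeholders, and it assumes the public v2 REST endpoint is reachable), the full path list can be split into chunks and one policy created per chunk:

split -l 128 all_paths.txt chunk_                 # all_paths.txt holds one resource path per line
n=0
for f in chunk_*; do
  n=$((n + 1))
  values=$(sed 's/.*/"&"/' "$f" | paste -sd, -)   # -> "path1","path2",...
  payload='{"service":"cluster_hadoop","name":"hdfs-paths-part-'$n'","isEnabled":true,
            "resources":{"path":{"values":['$values'],"isRecursive":true}},
            "policyItems":[{"users":["hive"],"accesses":[{"type":"read","isAllowed":true}]}]}'
  curl -u admin:admin -H "Content-Type: application/json" \
       -X POST "http://ranger-host:6080/service/public/v2/api/policy" -d "$payload"
done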
01-24-2017
02:29 PM
@Terry Stebbens Thanks a lot again for all your help. In HDP sandbox 2.5, while testing Ranger functionality, I found that I need hive user access, but I am not sure what the default hive user credentials are. Can you help me with this please? I am trying to follow the best practices for HDFS authorization:

"Having a federated authorization model may create a challenge for security administrators looking to plan a security model for HDFS. After Apache Ranger and Hadoop have been installed, we recommend administrators to implement the following steps:

Change HDFS umask to 077
Identify the directories that can be managed by Ranger policies

We recommend that permissions for application data folders (/apps/hive, /apps/Hbase) as well as any custom data folders be managed through Apache Ranger. The HDFS native permissions for these directories need to be restrictive. This can be done by changing permissions in HDFS using chmod. Example:

$ hdfs dfs -chmod -R 000 /apps/hive
$ hdfs dfs -chown -R hdfs:hdfs /apps/hive
$ hdfs dfs -ls /apps/hive
Found 1 items
d---------   - hdfs hdfs 0 2015-11-30 08:01 /apps/hive/warehouse"

After changing the umask to 077, it does not allow me to do any operation as the root user; it is looking for the hive user. Can you guide me please? Error:

at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:350)
chmod: changing permissions of '/apps/hive': Permission denied. user=root is not the owner of inode=hive
http://hortonworks.com/blog/best-practices-in-hdfs-authorization-with-apache-ranger/
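A hedged note on the error above: user root is not the HDFS superuser (the ls output shows the directories owned by hdfs), so one approach that should work, assuming the default hdfs service account on the sandbox, is to run the commands as that user:

sudo -u hdfs hdfs dfs -chmod -R 000 /apps/hive
sudo -u hdfs hdfs dfs -chown -R hdfs:hdfs /apps/hive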
01-12-2017
04:43 AM
Thanks @Terry Stebbens and @Ian Roberts, I have changed yarn.scheduler.capacity.maximum-am-resource-percent to 0.4 (allowing up to 40% of the cluster's resources to be used for ApplicationMasters) and it is working fine.
12-01-2016
05:06 PM
@Terry Stebbens Thanks Terry for the quick response. We are using a third-party CA (not self-signed certificates). Currently, while generating the CSR, we give Common Name = {hostname}; $hostname yields abc-xyz-001.CompanyName.COM. If we instead give CN = *.CompanyName.COM, do we need to get a domain set up in DNS to handle this? Thanks, Arpan
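For illustration, a hedged example of generating a CSR with a wildcard common name using openssl (the key/CSR file names and the C/O fields are placeholders):

openssl req -new -newkey rsa:2048 -nodes \
  -keyout wildcard.CompanyName.COM.key \
  -out wildcard.CompanyName.COM.csr \
  -subj "/C=US/O=CompanyName/CN=*.CompanyName.COM"

One thing to note is that the wildcard only affects which names the certificate matches; each host still needs its own DNS record so that clients can resolve it.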
01-25-2017
09:55 AM
Hi, you need to give your Ubuntu host a domain name, which you then add as the FQDN. For the second part, you need to copy the id_rsa file you created and paste it into the SSH private key box (it is either id_rsa or id_rsa.pub, I can't remember), as shown in the image above.
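If it helps, a minimal sketch of the key setup (assuming passwordless root SSH from the Ambari server to the agent hosts; the host name is a placeholder):

ssh-keygen -t rsa                          # creates ~/.ssh/id_rsa (private) and ~/.ssh/id_rsa.pub (public)
ssh-copy-id root@agent-host.example.com    # installs id_rsa.pub in the agent host's authorized_keys
cat ~/.ssh/id_rsa                          # the private key (id_rsa) is what gets pasted into the SSH private key box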
10-27-2016
06:03 AM
Yes, it will not work at all with the Hive CLI; better to use Beeline instead, thanks!
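For example, a typical Beeline connection looks like this (host, port and user are placeholders):

beeline -u "jdbc:hive2://hiveserver2-host:10000/default" -n hive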
10-24-2016
01:13 PM
Okay. I was hoping this feature could be, or will be, available in resource-based policies. One use case would be data in HDFS that should only be accessible based on location or during a certain time period.