Support Questions


Ranger HDFS policy fails if the number of HDFS paths is 128.

Contributor

We want to restrict access to a specified set of tables (200 tables) using Ranger Hive/HDFS policies.

To achieve this, we created HDFS and Hive policies using the REST API. In the HDFS policy, we individually list all of the HDFS paths, i.e. /apps/hive/warehouse/test.db/table1, /apps/hive/warehouse/test.db/table2, and so on, so that users cannot bypass the Ranger Hive policies and access the tables through the Hive CLI or an HDFS client.
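For reference, a minimal sketch of building such a policy body for the Ranger public v2 policy API (the service name, policy name, user, and access types here are placeholders; the `resources`/`policyItems` layout follows Ranger's v2 schema, and the resulting JSON would be POSTed to `/service/public/v2/api/policy`):

```python
import json

def build_hdfs_policy(service, name, db, tables, users):
    """Build a Ranger HDFS policy body listing one path per table."""
    paths = ["/apps/hive/warehouse/%s/%s" % (db, t) for t in tables]
    return {
        "service": service,          # Ranger HDFS service (repository) name
        "name": name,
        "isEnabled": True,
        "resources": {
            "path": {"values": paths, "isRecursive": True}
        },
        "policyItems": [{
            "users": users,
            "accesses": [{"type": "read", "isAllowed": True},
                         {"type": "execute", "isAllowed": True}],
        }],
    }

# 200 table paths, as in the scenario above (all names are placeholders)
policy = build_hdfs_policy("cl1_hadoop", "test_db_tables", "test.db",
                           ["table%d" % i for i in range(1, 201)],
                           ["etl_user"])
body = json.dumps(policy)  # POST this body to the Ranger policy endpoint
```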

The problem is that if the number of HDFS paths is more than 128, the REST API call for the HDFS policy fails with the error below:

{"statusCode":1,"msgDesc":"Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.5.2.v20131113-a7346c6): org.eclipse.persistence.exceptions.DatabaseException\nInternal Exception: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Out of range value for column 'sort_order' at row 1\nError Code: 1264\nCall: INSERT INTO x_policy_resource_map (ADDED_BY_ID, CREATE_TIME, sort_order, resource_id, UPDATE_TIME, UPD_BY_ID, value) VALUES (?, ?, ?, ?, ?, ?, ?)\n\tbind => [7 parameters bound]\nQuery: InsertObjectQuery(XXPolicyResourceMap [XXDBBase={createTime={Wed Oct 12 22:00:32 EDT 2016} updateTime={Thu Oct 13 13:56:38 EDT 2016} addedByUserId={1} updatedByUserId={1} } id=null, resourceId=167, value=/apps/hive/warehouse/dev_tbls.db/abcd, order=128])"}

1 ACCEPTED SOLUTION

Super Collaborator
@Amit Kumar Agarwal

Which version of HDP are you using? Please check the datatype of the column "sort_order" in the x_policy_resource_map table; if it is TINYINT, alter it to INT and try again.
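A sketch of that fix as a MySQL statement, to be run against the Ranger database (back up the table first; this assumes the column is currently TINYINT as the error trace suggests, and that no other column attributes need to be preserved):

```sql
-- Widen sort_order so a policy can hold more than 128 resource values.
ALTER TABLE x_policy_resource_map MODIFY COLUMN sort_order INT;
```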

