
Ranger permissions to create temporary function

Expert Contributor

With Ranger permissions for Hive, I can create a permanent function, but not a temporary function.

I have Ranger installed with Hive doAs=false (queries run as the hive user). Only Hive policies are set up, with no HDFS policies. There are two policies for this user and this database:

1) All Tables in this database granting all permissions

2) All Functions in this database granting all permissions

With the TEMPORARY keyword I get an access-denied error; remove it and there is no problem.

Are there specific permissions needed for temp functions? Maybe in some other database?

-- Create Temporary Function - No Joy!
0: jdbc:hive2://evdp-lnx-hrt014.aginity.local> CREATE TEMPORARY FUNCTION FN_UNIQUE_NUMBER
0: jdbc:hive2://evdp-lnx-hrt014.aginity.local> AS 'com.screamingweasel.amp.hive.udf.UniqueNumberGenerator'
0: jdbc:hive2://evdp-lnx-hrt014.aginity.local> USING JAR 'hdfs:///tmp/custom-hive-udf-2.4.1.jar';
Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [batyr_amp_admin] does not have [CREATE] privilege on [amp_unique_number] (state=42000,code=40000)

-- Create permanent function no problem
0: jdbc:hive2://evdp-lnx-hrt014.aginity.local> CREATE FUNCTION FN_UNIQUE_NUMBER
0: jdbc:hive2://evdp-lnx-hrt014.aginity.local> AS 'com.screamingweasel.amp.hive.udf.UniqueNumberGenerator'
0: jdbc:hive2://evdp-lnx-hrt014.aginity.local> USING JAR 'hdfs:///tmp/custom-hive-udf-2.4.1.jar';
INFO  : converting to local hdfs:///tmp/custom-hive-udf-2.4.1.jar
INFO  : Added [/tmp/68037a68-6f5b-40ef-8073-fefdaa6319be_resources/custom-hive-udf-2.4.1.jar] to class path
INFO  : Added resources: [hdfs:///tmp/custom-hive-udf-2.4.1.jar]
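For reference, the "All Functions" policy described above can be sketched as Ranger policy JSON (the shape used by Ranger's public REST API). The user `batyr_amp_admin` and database `amp_unique_number` come from the error message in this thread; the service name `hivedev` is a placeholder, and the exact JSON shape may vary by Ranger version. Note that function policies use the `database` and `udf` resources:

```json
{
  "service": "hivedev",
  "name": "amp_unique_number - all functions",
  "resources": {
    "database": { "values": ["amp_unique_number"] },
    "udf":      { "values": ["*"] }
  },
  "policyItems": [
    {
      "users": ["batyr_amp_admin"],
      "accesses": [
        { "type": "create", "isAllowed": true },
        { "type": "drop",   "isAllowed": true },
        { "type": "select", "isAllowed": true }
      ]
    }
  ]
}
```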
1 ACCEPTED SOLUTION

Expert Contributor

Nothing like writing something down to make you find the answer! There is an open JIRA against Ranger for this: the database name is not passed correctly when determining permissions for temporary functions. This is not the case for permanent functions.

The current workaround is to specify "*" for the database name in the function policy.
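A minimal sketch of the workaround policy as Ranger policy JSON, assuming the standard Hive service-def `database`/`udf` resources; the service name `hivedev` and policy name are placeholders. The only change from a normal function policy is the `"*"` database value, which sidesteps the missing-database-name bug described above:

```json
{
  "service": "hivedev",
  "name": "temporary function workaround",
  "resources": {
    "database": { "values": ["*"] },
    "udf":      { "values": ["*"] }
  },
  "policyItems": [
    {
      "users": ["batyr_amp_admin"],
      "accesses": [
        { "type": "create", "isAllowed": true }
      ]
    }
  ]
}
```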


6 REPLIES


New Contributor

@jbarnett Thank you for this information. I had the same problem, and the wildcard "*" helped me. But I cannot find the JIRA you are talking about. Could you share the JIRA number or a direct link? Thanks.



@jbarnett I've enabled the '*' permission on the database, the JAR file and its directory have ALL permissions, and /user/hive has full permissions, but I am still unable to create a temporary function. Can you share your workaround in more detail?

Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [XXXXXX] does not have [CREATE] privilege on [ZZZZZ] (state=42000,code=40000)

Explorer

@Rodrigo Rondena I am having the same problem. There is full access, but I still get an error when creating a temporary function. Please let me know what else I can try.

I was also not able to find the JIRA that @jbarnett mentioned.

@Rodrigo Rondena Were you able to find a solution? I have the same problem.


This Ranger JIRA actually depends on a Hive JIRA before it can be fixed.

Explorer

So I identified that this is a bug that will only be fixed in HDP 3. There is a workaround that worked on HDP 2.6.0 but stopped working in 2.6.1. I upgraded my stack to 2.6.2 and it works fine now.