Created 07-20-2016 09:02 AM
I was reading Ranger's documentation, and it looks like Apache Spark (within a Hadoop-YARN cluster) is not included in the list of supported services. My questions are:
(1) Is it possible to still use Ranger to control who can use Spark?
(2) Are there other tools / frameworks / strategies I can employ to secure my Spark cluster?
(3) If Ranger allows it, or if there is such a tool / framework / strategy, can I also do fine-grained authorization, such as database-, table-, and column-level permissions, for Spark SQL?
Created 07-20-2016 09:06 AM
Currently Ranger does not provide direct support for Spark, but you can configure Ranger plugins for the underlying layers, such as Hive and HDFS.
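For example, since Spark SQL typically reads Hive tables as files under the HDFS warehouse directory, a Ranger HDFS policy on that path can restrict which users' Spark jobs may read it. Below is a minimal sketch of building such a policy payload for the Ranger Admin public v2 REST API; the service name `hadoopdev`, the warehouse path, and the user `analyst` are assumptions for illustration, not values from this thread:

```python
import json

# Build a Ranger HDFS policy payload (Ranger public v2 API shape).
# Service name "hadoopdev", the path, and user "analyst" are hypothetical.
policy = {
    "service": "hadoopdev",               # assumed Ranger HDFS service name
    "name": "spark-warehouse-sales",      # policy name
    "resources": {
        "path": {
            "values": ["/apps/hive/warehouse/sales.db"],  # assumed table location
            "isRecursive": True,
        }
    },
    "policyItems": [
        {
            "users": ["analyst"],         # who may read this data via Spark/HDFS
            "accesses": [
                {"type": "read", "isAllowed": True},
                {"type": "execute", "isAllowed": True},
            ],
        }
    ],
}

payload = json.dumps(policy)
# This payload would then be POSTed to the Ranger Admin, e.g.:
#   curl -u admin:password -H "Content-Type: application/json" \
#        -X POST -d @policy.json http://ranger-host:6080/service/public/v2/api/policy
```

Note that this only gives file-level (path) enforcement; Spark jobs that bypass HiveServer2 and read HDFS directly are not subject to Ranger's Hive column-level policies.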
Created 07-20-2016 09:16 AM
As per the latest documentation, Spark is currently not supported by Ranger; there is no Spark plugin available for Ranger.