
How to use Apache Spark to query Hive table with Kerberos?


I am attempting to use Scala with Apache Spark locally to query a Hive table that is secured with Kerberos. I have no issues connecting and querying the data programmatically without Spark; the problem arises when I try to connect and query through Spark.

 

My code when run locally without Spark:

 

import java.sql.DriverManager
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

Class.forName("org.apache.hive.jdbc.HiveDriver")
System.setProperty("kerberos.keytab", keytab)
// the principal property should hold the principal, not the keytab path
System.setProperty("kerberos.principal", principal)
// krb5ConfPath and jaasConfPath hold the file paths to krb5.conf and jaas.conf
System.setProperty("java.security.krb5.conf", krb5ConfPath)
System.setProperty("java.security.auth.login.config", jaasConfPath)

val conf = new Configuration
conf.set("hadoop.security.authentication", "Kerberos")
UserGroupInformation.setConfiguration(conf)
// note: createProxyUser returns a UGI that must be used via doAs; its result is unused here
UserGroupInformation.createProxyUser("user", UserGroupInformation.getLoginUser)
UserGroupInformation.loginUserFromKeytab(user, keytab)
UserGroupInformation.getLoginUser.checkTGTAndReloginFromKeytab()
if (UserGroupInformation.isLoginKeytabBased) {
  UserGroupInformation.getLoginUser.reloginFromKeytab()
} else if (UserGroupInformation.isLoginTicketBased) {
  UserGroupInformation.getLoginUser.reloginFromTicketCache()
}

// HiveServer2 expects the jdbc:hive2:// URL scheme
val con = DriverManager.getConnection("jdbc:hive2://hdpe-hive.company.com:10000", user, password)
val rs = con.prepareStatement("select * from table limit 5").executeQuery()

 

 

Does anyone know how I could include the keytab, krb5.conf, and jaas.conf in my Spark initialization function so that I can authenticate with Kerberos and obtain the TGT?

 

My Spark initialization function:

 

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

val conf = new SparkConf()
  .setAppName("mediumData")
  .setMaster(numCores)
  .set("spark.driver.host", "localhost")
  .set("spark.ui.enabled", "true") // enable the Spark UI
  .set("spark.sql.shuffle.partitions", defaultPartitions)
val sparkSession = SparkSession.builder.config(conf).enableHiveSupport().getOrCreate()

 

 

I do not have files such as hive-site.xml or core-site.xml.
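For reference, here is a sketch of how the Kerberos settings might be wired into the Spark initialization. This is untested; the file paths, principal, and the `spark.yarn.keytab` / `spark.yarn.principal` settings are assumptions for illustration, not a verified answer:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

// Assumed paths and principal -- adjust for the actual environment.
val keytab    = "/etc/security/keytabs/user.keytab"
val principal = "user@COMPANY.COM"

// JVM-wide Kerberos settings must be in place before any Hadoop class loads.
System.setProperty("java.security.krb5.conf", "/etc/krb5.conf")
System.setProperty("java.security.auth.login.config", "/etc/jaas.conf")

// Log in from the keytab so the driver process holds a TGT.
val hadoopConf = new Configuration
hadoopConf.set("hadoop.security.authentication", "Kerberos")
UserGroupInformation.setConfiguration(hadoopConf)
UserGroupInformation.loginUserFromKeytab(principal, keytab)

val conf = new SparkConf()
  .setAppName("mediumData")
  .setMaster("local[*]")
  // On YARN, Spark can manage the Kerberos login itself from these settings
  // (renamed spark.kerberos.keytab / spark.kerberos.principal in Spark 3.x).
  .set("spark.yarn.keytab", keytab)
  .set("spark.yarn.principal", principal)
  // Forward the krb5.conf location to executor JVMs as well.
  .set("spark.executor.extraJavaOptions", "-Djava.security.krb5.conf=/etc/krb5.conf")

val sparkSession = SparkSession.builder.config(conf).enableHiveSupport().getOrCreate()
```

Note that without a hive-site.xml on the classpath, `enableHiveSupport()` creates a local embedded metastore, so reaching the cluster's secured Hive would presumably also require pointing `hive.metastore.uris` at the remote metastore.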

Thank you!

 
