Support Questions


org.apache.hadoop.security.authentication.client.AuthenticationException

Explorer

Hi, I am working on a Spark Streaming project. My Hadoop cluster has Kerberos enabled. I added some properties while creating the HiveContext:

 

import java.security.PrivilegedExceptionAction
import java.util.Properties

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation
import org.apache.hadoop.security.UserGroupInformation.AuthenticationMethod
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

@transient private var instance: HiveContext = _

def getHiveContext(sparkContext: SparkContext, properties: Properties): HiveContext = {
  synchronized {
    val configuration = new Configuration
    configuration.addResource("/etc/hadoop/conf/hdfs-site.xml")
    UserGroupInformation.setConfiguration(configuration)
    UserGroupInformation.getCurrentUser.setAuthenticationMethod(AuthenticationMethod.KERBEROS)

    if (instance == null) {
      System.setProperty("hive.metastore.uris", properties.getProperty("hive.metastore.uris"))
      if (properties.getProperty("kerberosSecurity").toBoolean) {
        System.setProperty("hive.metastore.sasl.enabled", "true")
        System.setProperty("hive.metastore.kerberos.keytab.file", sparkContext.getConf.get("spark.yarn.keytab"))
        System.setProperty("hive.security.authorization.enabled", "false")
        System.setProperty("hive.metastore.kerberos.principal", properties.getProperty("hive.metastore.kerberos.principal"))
        System.setProperty("hive.metastore.execute.setugi", "true")
      }

      // Log in from the keytab and create the HiveContext as that user.
      UserGroupInformation.loginUserFromKeytabAndReturnUGI(
        properties.getProperty("hadoop.kerberos.principal"), sparkContext.getConf.get("spark.yarn.keytab"))
        .doAs(new PrivilegedExceptionAction[HiveContext]() {
          override def run(): HiveContext = {
            instance = new HiveContext(sparkContext)
            instance
          }
        })
    }

    UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      properties.getProperty("hadoop.kerberos.principal"), sparkContext.getConf.get("spark.yarn.keytab"))
      .doAs(new PrivilegedExceptionAction[HiveContext]() {
        override def run(): HiveContext = {
          instance
        }
      })
  }
}

 

The program runs fine: Spark writes data to Hive every 5 minutes. But after about 24 to 30 hours, my Spark program fails with a Kerberos authentication error.

 

Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=svc-dev, renewer=yarn, realUser=, issueDate=1518875523811, maxDate=1519480323811, sequenceNumber=85825, masterKeyId=1062) can't be found in cache
at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.decryptEncryptedKey(LoadBalancingKMSClientProvider.java:216)
at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1440)
at org.apache.hadoop.hdfs.DFSClient.createWrappedInputStream(DFSClient.java:1510)
at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:328)
at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:322)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:322)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:783)
at parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:407)
at parquet.hadoop.ParquetFileReader$2.call(ParquetFileReader.java:238)
... 5 more
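The timestamps in that token entry may be relevant: maxDate minus issueDate is exactly 7 days, which is a common delegation-token maximum lifetime, and delegation tokens by default also need renewal roughly every 24 hours, which lines up with the 24-30 hour failure window. A quick check of the arithmetic, using the epoch-millisecond values copied from the trace:

```shell
# issueDate and maxDate copied from the kms-dt token in the stack trace
# (epoch milliseconds); 86400000 ms per day.
issue=1518875523811
max=1519480323811
echo "token max lifetime: $(( (max - issue) / 86400000 )) days"
# prints: token max lifetime: 7 days
```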

 

 

Can anybody help me out with this problem?

 

2 REPLIES

Champion

You might want to use a keytab to avoid expiration of the Kerberos ticket.

Have you checked whether you have a valid Kerberos ticket?

 

 

https://www.cloudera.com/documentation/enterprise/5-5-x/topics/cdh_sg_kadmin_kerberos_keytab.html
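For a long-running job on YARN, a common approach is to hand the principal and keytab to spark-submit itself, so that Spark re-obtains delegation tokens (including the KMS one) as they expire, rather than relying on a one-time login in application code. A sketch only — the principal, realm, keytab path, and jar name below are placeholders, substitute your own:

```shell
# Verify the current ticket cache first.
klist

# Placeholders below (principal, realm, keytab path, application jar) --
# substitute the values for your cluster.
spark-submit \
  --master yarn \
  --principal svc-dev@YOUR.REALM \
  --keytab /etc/security/keytabs/svc-dev.keytab \
  your-streaming-app.jar
```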

Explorer

I have already provided the principal and keytab:

UserGroupInformation.loginUserFromKeytabAndReturnUGI(
  properties.getProperty("hadoop.kerberos.principal"),
  sparkContext.getConf.get("spark.yarn.keytab")
)