Hi,
When we try to run spark-submit on our Kerberos cluster, we get the error mentioned below.
spark-submit -v \
--master yarn-client \
--jars $myDependencyJarFiles \
--conf spark.default.parallelism=12 \
--conf spark.driver.memory=4g \
--conf spark.executor.memory=4g \
--conf spark.yarn.security.tokens.hive.enabled=false \
--conf spark.yarn.security.tokens.hbase.enabled=false \
--conf spark.yarn.executor.memoryOverhead=2048 \
--conf spark.kryoserializer.buffer.max=512m \
--conf spark.yarn.appMasterEnv.MASTER=yarn-client \
--conf spark.yarn.appMasterEnv.HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop \
--conf spark.yarn.appMasterEnv.HADOOP_MAPRED_HOME=/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce \
--conf spark.hadoop.fs.hdfs.impl.disable.cache=true \
--keytab user.keytab \
--principal user@abc.domain.com \
--class com.test.spark.spark.testclass \
ERROR:
23-10-2017 15:54:48 GMT test-hourly ERROR - 17/10/23 15:54:48 WARN yarn.ExecutorDelegationTokenUpdater: Error while trying to update credentials, will try again in 1 hour
23-10-2017 15:54:48 GMT test-hourly ERROR - 17/10/23 15:54:48 ERROR util.Utils: Uncaught exception in thread main
23-10-2017 15:54:48 GMT test-hourly ERROR - java.lang.StackOverflowError
Also, I have tried running spark-submit both with the principal & keytab and without them, but the error was the same.
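For reference, the run without the principal & keytab looked roughly like this (assuming the Kerberos ticket came from a kinit with the same keytab beforehand; the --conf options are identical to the ones above and are omitted here, as are the application jar and its arguments):
kinit -k -t user.keytab user@abc.domain.com
spark-submit -v \
--master yarn-client \
--jars $myDependencyJarFiles \
--class com.test.spark.spark.testclass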
Can anybody please help me with this? Is the error related to running in client mode on a Kerberos cluster, and if so, what would be a possible fix?
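In case client mode is indeed the problem, this is roughly how I understand the cluster-mode variant of the same command would look (the --conf options above would carry over, except spark.yarn.appMasterEnv.MASTER, which I assume would need to change as well; application jar and arguments again omitted):
spark-submit -v \
--master yarn-cluster \
--jars $myDependencyJarFiles \
--keytab user.keytab \
--principal user@abc.domain.com \
--class com.test.spark.spark.testclass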
Thanks,
Amit