
Problem running a profiling job on Spark on a cluster with Kerberos


Hi Guys,

 

I am trying to run a Spark job on a cluster with Kerberos enabled. Unfortunately, I am having a problem with delegation tokens.

My cluster:

CDH 5.7.4, CM 5.8.3, RHEL 7.2, Kerberos (AD - Windows Server 2012 (2003 compatibility mode)).

Spark 1.6
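
As far as I understand, the job server first has to log in to HDFS from its keytab before the NameNode will issue delegation tokens on its behalf. A minimal sketch of that login (the keytab path here is only an illustration, not my actual configuration) would look roughly like this:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

// Minimal sketch of a keytab-based Kerberos login against HDFS.
// The principal matches the service user shown in the log below;
// the keytab path is a placeholder.
object KerberosLoginSketch extends App {
  val conf = new Configuration()
  conf.set("hadoop.security.authentication", "kerberos")
  UserGroupInformation.setConfiguration(conf)

  UserGroupInformation.loginUserFromKeytab(
    "npdsvc.cloudera.trif@NONPROD.LOCAL",   // service principal
    "/path/to/service.keytab")              // hypothetical keytab path

  // Should report "... (auth:KERBEROS)" once the login succeeds.
  println(UserGroupInformation.getLoginUser)
}

The spark-job-server log from the failed run is below.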

 

 

2016-11-23T16:24:10.591Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.logging.JobLogger - Log appender added for job Id 81
2016-11-23T16:24:10.591Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - ******** Processing Profiler request with Job ID 81 ********
2016-11-23T16:24:10.592Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - HADOOP_CONF_DIR = /opt/trifacta/conf/hadoop-site/
2016-11-23T16:24:10.593Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - HADOOP_USER_NAME = npdsvc.cloudera.trif
2016-11-23T16:24:10.593Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - SPARK_DIST_CLASSPATH = 
2016-11-23T16:24:10.593Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - java.class.path = /opt/trifacta/services/spark-job-server/server/build/libs/spark-job-server-bundle.jar:/opt/trifacta/conf/hadoop-site/:/opt/trifacta/services/spark-job-server/build/bundle/spark-assembly-1.6.1.jar:hadoop-deps/cdh-5.7/build/libs/cdh-5.7-bundle.jar:
2016-11-23T16:24:10.594Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - HADOOP_CONF_DIR = /opt/trifacta/conf/hadoop-site/
2016-11-23T16:24:10.613Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - fs.default.name = hdfs://nameservice1
2016-11-23T16:24:10.614Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Application configuration:
2016-11-23T16:24:10.614Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - topOfTree=/opt/trifacta
2016-11-23T16:24:10.614Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - kerberos.enabled=true
2016-11-23T16:24:10.615Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - aws.s3.enabled=false
2016-11-23T16:24:10.615Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - hdfs.pathsConfig.tempFiles=/trifacta/tempfiles
2016-11-23T16:24:10.615Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - hdfs.pathsConfig.libraries=/trifacta/libraries
2016-11-23T16:24:10.615Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - hdfs.pathsConfig.sparkEvenLogs=/trifacta/sparkeventlogs
2016-11-23T16:24:10.616Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - hdfs.pathsConfig.dictionaries=/trifacta/dictionaries
2016-11-23T16:24:10.616Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - spark-job-service.hiveDependenciesLocation=/opt/trifacta/services/data-service/build/dependencies/
2016-11-23T16:24:10.616Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - batchserver.logging.jobs.path=/opt/trifacta/logs/jobs
2016-11-23T16:24:10.617Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - spark-job-service.enableHistoryServer=false
2016-11-23T16:24:10.617Z com.trifacta.jobserver.logging.LogUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Spark config:
2016-11-23T16:24:10.617Z com.trifacta.jobserver.logging.LogUtil$$anonfun$logConfig$1 [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - spark.driver.maxResultSize = 2g
2016-11-23T16:24:10.618Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - HadoopPrincipal = Some()
2016-11-23T16:24:10.633Z com.trifacta.jobserver.JobHelper$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.JobHelper - Received job spec eyJpbnB1dCI6ImhkZnM6Ly9MU0dOUERITU4wMS5ub25wcm9kLmxvY2FsOjgwMjAvdHJpZmFjdGEvcXVlcnlSZXN1bHRzL2FkbWluQHRyaWZhY3RhLmxvY2FsL2N1c3RvbWVyLzYyLy5wcm9maWxlci9wcm9maWxlcklucHV0Lmpzb24iLCJmb3JtYXQiOiJqc29uIiwic2NoZW1hIjp7Im9yZGVyIjpbIklNU0kiLCJDT05UUkFDVF9FTkQiLCJDT05UUkFDVF9TVEFSVCIsIlNVQlNDUklCRVJfQUdFIiwiU1RBVFVTIiwiV0VCX0NIQVRfSUQiLCJDT1VOVFJZIiwiVElUTEUiLCJGSVJTVF9OQU1FIiwiTEFTVF9OQU1FIiwiU1NOIiwiQUREUkVTUyIsIkNJVFkiLCJTVEFURSIsIlpJUCIsIlBIT05FIiwiRU1BSUwiLCJUV0lUVEVSIiwiT0NDVVBBVElPTiJdLCJ0eXBlcyI6eyJJTVNJIjpbIkludGVnZXIiLHsicmVnZXhlcyI6WyJeLT8oPzowfFsxLTldWzAtOV17MCwxNX0pJCJdfV0sIkNPTlRSQUNUX0VORCI6WyJEYXRldGltZSIseyJyZWdleGVzIjpbIl4oMD9bMS05XXwxWzAtMl0pKD86IHwsfC98LSkoWzAtMl0/WzAtOV18MzB8MzEpKD86IHwsfC98LSkoXFxkezJ9KSQiXSwiZ3JvdXBMb2NzIjp7Im1vbnRoIjoxLCJkYXRlIjoyLCJ5ZWFyIjozfX1dLCJDT05UUkFDVF9TVEFSVCI6WyJEYXRldGltZSIseyJyZWdleGVzIjpbIl4oMD9bMS05XXwxWzAtMl0pKD86IHwsfC98LSkoWzAtMl0/WzAtOV18MzB8MzEpKD86IHwsfC98LSkoXFxkezJ9KSQiXSwiZ3JvdXBMb2NzIjp7Im1vbnRoIjoxLCJkYXRlIjoyLCJ5ZWFyIjozfX1dLCJTVUJTQ1JJQkVSX0FHRSI6WyJJbnRlZ2VyIix7InJlZ2V4ZXMiOlsiXi0/KD86MHxbMS05XVswLTldezAsMTV9KSQiXX1dLCJTVEFUVVMiOlsiU3RyaW5nIix7fV0sIldFQl9DSEFUX0lEIjpbIlN0cmluZyIse31dLCJDT1VOVFJZIjpbIlN0cmluZyIse31dLCJUSVRMRSI6WyJTdHJpbmciLHt9XSwiRklSU1RfTkFNRSI6WyJTdHJpbmciLHt9XSwiTEFTVF9OQU1FIjpbIlN0cmluZyIse31dLCJTU04iOlsiU1NOIix7InJlZ2V4ZXMiOlsiXigwMFsxLTldfDBbMS05XVxcZHw2NlswLTU3LTldfDZbMC01Ny05XVxcZHxbMS01Ny04XVxcZHsyfSkoWyAtXT8pKDBbMS05XXxbMS05XVxcZClcXDIoMDAwWzEtOV18MDBbMS05XVxcZHwwWzEtOV1cXGR7Mn18WzEtOV1cXGR7M30pJCJdfV0sIkFERFJFU1MiOlsiU3RyaW5nIix7fV0sIkNJVFkiOlsiU3RyaW5nIix7fV0sIlNUQVRFIjpbIlN0YXRlIix7InJlZ2V4ZXMiOlsiXihhbHxha3xhenxhcnxjYXxjb3xjdHxkZXxmbHxnYXxoaXxpZHxpbHxpbnxpYXxrc3xreXxsYXxtZXxtZHxtYXxtaXxtbnxtc3xtb3xtdHxuZXxudnxuaHxuanxubXxueXxuY3xuZHxvaHxva3xvcnxwYXxyaXxzY3xzZHx0bnx0eHx1dHx2dHx2YXx3YXx3dnx3aXx3eXxwcnxkY3x2aSkkIiwiXihhbGFiYW1hfGFsYXNrYXxhcml6b25hfGFya2Fuc2FzfGNhbGlmb3JuaWF8Y29sb3JhZG98Y29ubmVjdGljdXR8ZGVsYXdhcmV8ZmxvcmlkYXxnZW9yZ2lhfGhhd2FpaXxpZGFob3xpbGxpbm9pc3xpbmRpYW5hfGlvd2F8a2Fuc2FzfGtlbnR1Y2t5fGxvdWlzaWFuYXxtYWluZXxtYXJ5bGFuZHxtYXNzYWNodXNldHRzfG1pY2hpZ2FufG1pbm5lc290YXxtaXNzaXNzaXBwaXxtaXNzb3VyaXxtb250YW5hfG5lYnJhc2thfG5ldmFkYXxuZXcgaGFtcHNoaXJlfG5ldyBqZXJzZXl8bmV3IG1leGljb3xuZXcgeW9ya3xub3J0aCBjYXJvbGluYXxub3J0aCBkYWtvdGF8b2hpb3xva2xhaG9tYXxvcmVnb258cGVubnN5bHZhbmlhfHJob2RlIGlzbGFuZHxzb3V0aCBjYXJvbGluYXxzb3V0aCBkYWtvdGF8dGVubmVzc2VlfHRleGFzfHV0YWh8dmVybW9udHx2aXJnaW5pYXx3YXNoaW5ndG9ufHdlc3QgdmlyZ2luaWF8d2lzY29uc2lufHd5b21pbmd8cHVlcnRvIHJpY298ZGlzdHJpY3Qgb2YgY29sdW1iaWF8dmlyZ2luIGlzbGFuZHMpJCJdfV0sIlpJUCI6WyJaaXBjb2RlIix7InJlZ2V4ZXMiOlsiXihbXFxkXXs1fXxbXFxkXXs5fSkkIiwiXltcXGRdezV9LVtcXGRdezR9Il19XSwiUEhPTkUiOlsiUGhvbmUiLHsicmVnZXhlcyI6WyJeKD86KD86XFwrPzFcXHMqKD86W1xcLi1dXFxzKik/KT8oPzpcXChcXHMqKD86WzItOV0xWzAyLTldfFsyLTldWzAyLThdWzAtOV0pXFxzKlxcKXwoPzpbMi05XTFbMDItOV18WzItOV1bMDItOF0xfFsyLTldWzAyLThdWzAyLTldKSlcXHMqKD86Wy4tXVxccyopPykoPzpbMi05XTFbMDItOV18WzItOV1bMDItOV0xfFsyLTldWzAyLTldezJ9KVxccyooPzpbLi1dXFxzKik/KD86WzAtOV17NH0pKD86XFxzKig/OiN8eFxcLj98ZXh0XFwuP3xleHRlbnNpb24pXFxzKig/OlxcZCspKT8kIl19XSwiRU1BSUwiOlsiRW1haWxhZGRyZXNzIix7InJlZ2V4ZXMiOlsiXlthLXowLTkuISMkJSYnKisvPT9eX2B7fH1+LV0rQFthLXowLTkuLV0rKD86XFwuW2EtejAtOS1dKykqXFwuW2Etel17Mix9JCJdfV0sIlRXSVRURVIiOlsiU3RyaW5nIix7fV0sIk9DQ1VQQVRJT04iOlsiU3RyaW5nIix7fV19fSwidXNlckNvbnRleHQiOiJucGRzdmMuY2xvdWRlcmEudHJpZiIsImNvbW1hbmRzIjpbeyJwcm9maWxlci10eXBlIjoiaGlzdG9ncmFtIiwiY29sdW1u
IjoiKiIsInBhcmFtcyI6eyJtYXhiaW5zIjoiMjAifSwib3V0cHV0IjoiaGRmczovL0xTR05QREhNTjAxLm5vbnByb2QubG9jYWw6ODAyMC90cmlmYWN0YS9xdWVyeVJlc3VsdHMvYWRtaW5AdHJpZmFjdGEubG9jYWwvY3VzdG9tZXIvNjIvLnByb2ZpbGVyL3Byb2ZpbGVyVmFsaWRWYWx1ZUhpc3RvZ3JhbXMuanNvbiJ9LHsicHJvZmlsZXItdHlwZSI6InR5cGUtY2hlY2siLCJjb2x1bW4iOiIqIiwicGFyYW1zIjp7fSwib3V0cHV0IjoiaGRmczovL0xTR05QREhNTjAxLm5vbnByb2QubG9jYWw6ODAyMC90cmlmYWN0YS9xdWVyeVJlc3VsdHMvYWRtaW5AdHJpZmFjdGEubG9jYWwvY3VzdG9tZXIvNjIvLnByb2ZpbGVyL3Byb2ZpbGVyVHlwZUNoZWNrSGlzdG9ncmFtcy5qc29uIn0seyJwcm9maWxlci10eXBlIjoic2FtcGxlcyIsImNvbHVtbiI6IioiLCJwYXJhbXMiOnt9LCJvdXRwdXQiOiJoZGZzOi8vTFNHTlBESE1OMDEubm9ucHJvZC5sb2NhbDo4MDIwL3RyaWZhY3RhL3F1ZXJ5UmVzdWx0cy9hZG1pbkB0cmlmYWN0YS5sb2NhbC9jdXN0b21lci82Mi8ucHJvZmlsZXIvcHJvZmlsZXJTYW1wbGVzLmpzb24ifV19
2016-11-23T16:24:10.634Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.util.KerberosTicketRenewer - Security enabled? true
2016-11-23T16:24:10.634Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.util.KerberosTicketRenewer - Current user = npdsvc.cloudera.trif@NONPROD.LOCAL (auth:KERBEROS)
2016-11-23T16:24:10.634Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.util.KerberosTicketRenewer - Login user = npdsvc.cloudera.trif@NONPROD.LOCAL (auth:KERBEROS)
2016-11-23T16:24:10.635Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.util.KerberosTicketRenewer - Checking and refreshing kerberos token for logged in user npdsvc.cloudera.trif@NONPROD.LOCAL (auth:KERBEROS)
2016-11-23T16:24:10.661Z org.apache.spark.deploy.yarn.SparkLaunchActor [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver - Optimize localization is true
2016-11-23T16:24:11.218Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver.util.JarDeployer - Source file /opt/trifacta/services/spark-job-server/build/bundle/spark-assembly-1.6.1.jar is NOT newer than destination file /trifacta/libraries/spark-assembly-1.6.1.jar. Source file with not be deployed
2016-11-23T16:24:11.241Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver.util.JarDeployer - Source file /opt/trifacta/services/spark-job-server/profiler/build/libs/profiler-bundle.jar is NOT newer than destination file /trifacta/libraries/profiler-bundle.jar. Source file with not be deployed
2016-11-23T16:24:11.250Z org.apache.spark.deploy.yarn.SparkLaunchActor [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver - Set spark.yarn.jar in SparkConf to hdfs://nameservice1/trifacta/libraries/spark-assembly-1.6.1.jar
2016-11-23T16:24:11.250Z org.apache.spark.deploy.yarn.SparkLaunchActor [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver - Set spark.yarn.user.jar in SparkConf to hdfs://nameservice1/trifacta/libraries/profiler-bundle.jar
2016-11-23T16:24:11.271Z org.apache.spark.deploy.yarn.SparkLaunchActor$ [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver - Reading hive dependencies from /opt/trifacta/services/data-service/build/dependencies
2016-11-23T16:24:11.277Z org.apache.spark.deploy.yarn.SparkLaunchActor [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver - ClientArguments: --name, Trifacta Profiler, --jar, hdfs://nameservice1/trifacta/libraries/profiler-bundle.jar, --class, com.trifacta.jobserver.profiler.spark.Profiler, --files, /tmp/spec_5389453854184812651.json#spec.json, --addJars, /opt/trifacta/services/data-service/build/dependencies/datanucleus-api-jdo-3.2.6.jar,/opt/trifacta/services/data-service/build/dependencies/datanucleus-core-3.2.10.jar,/opt/trifacta/services/data-service/build/dependencies/datanucleus-rdbms-3.2.9.jar,/opt/trifacta/services/data-service/build/dependencies/hive-jdbc-0.14.0.jar, --executor-cores, 1, --executor-memory, 1g, --driver-cores, 1, --driver-memory, 1g, --arg, eyJjb25maWciOnsiaGFkb29wSW1wZXJzb25hdGlvbiI6ImZhbHNlIiwia2VyYmVyb3MucHJpbmNpcGFsIjoibnBkc3ZjLmNsb3VkZXJhLnRyaWYiLCJrZXJiZXJvcy5lbmFibGVkIjoidHJ1ZSIsInNwYXJrLWpvYi1zZXJ2aWNlLmVuYWJsZUhpc3RvcnlTZXJ2ZXIiOiJmYWxzZSIsImhkZnMucGVybWlzc2lvbnMudXNlclVtYXNrIjoiMDI3IiwiYXdzLnMzLmJ1Y2tldC5uYW1lIjoiPFlPVVJfVkFMVUVfSEVSRT4iLCJiYXRjaHNlcnZlci5sb2dnaW5nLmpvYnMucGF0aCI6IiUodG9wT2ZUcmVlKXMvbG9ncy9qb2JzIiwidG9wT2ZUcmVlIjoiL29wdC90cmlmYWN0YSIsImF3cy5zMy5rZXkiOiI8WU9VUl9WQUxVRV9IRVJFPiIsImhkZnMucGF0aHNDb25maWcudGVtcEZpbGVzIjoiL3RyaWZhY3RhL3RlbXBmaWxlcyIsImhkZnMucGF0aHNDb25maWcubGlicmFyaWVzIjoiL3RyaWZhY3RhL2xpYnJhcmllcyIsInNwYXJrLWpvYi1zZXJ2aWNlLmhpdmVEZXBlbmRlbmNpZXNMb2NhdGlvbiI6IiUodG9wT2ZUcmVlKXMvc2VydmljZXMvZGF0YS1zZXJ2aWNlL2J1aWxkL2RlcGVuZGVuY2llcy8iLCJzcGFyay1qb2Itc2VydmljZS5lbnYuSEFET09QX1VTRVJfTkFNRSI6Im5wZHN2Yy5jbG91ZGVyYS50cmlmIiwiU1BBUktfSk9CX1NFUlZJQ0VfUE9SVCI6IjQwMDciLCJzcGFyay1qb2Itc2VydmljZS5zcGFya0ltcGVyc29uYXRpb25PbiI6ImZhbHNlIiwiYXdzLnMzLmVuYWJsZWQiOiJmYWxzZSIsInNwYXJrLnByb3BzLnNwYXJrLmRyaXZlci5tYXhSZXN1bHRTaXplIjoiMmciLCJoZGZzLnBhdGhzQ29uZmlnLnNwYXJrRXZlbkxvZ3MiOiIvdHJpZmFjdGEvc3BhcmtldmVudGxvZ3MiLCJrZXJiZXJvcy5rZXl0YWIiOiIvb3B0L3RyaWZhY3RhL2NvbmYvdHJpZmFjdGEua2V5dGFiIiwic3Bhcmstam9iLXNlcnZpY2Uub3B0aW1pemVMb2NhbGl6YXRpb24iOiJ0cnVlIiwiYXdzLnMzLmhhZG9vcEZTTW9kZSI6InMzYSIsImhkZnMucGF0aHNDb25maWcuZGljdGlvbmFyaWVzIjoiL3RyaWZhY3RhL2RpY3Rpb25hcmllcyIsImF3cy5zMy5zZWNyZXQiOiI8WU9VUl9WQUxVRV9IRVJFPiIsIndlYmFwcC5zdG9yYWdlUHJvdG9jb2wiOiJoZGZzIiwiaGRmcy5wZXJtaXNzaW9ucy5zeXN0ZW1VbWFzayI6IjAyNyJ9fQ==
2016-11-23T16:24:11.319Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver.util.RunAs - Impersonation on? false
2016-11-23T16:24:13.269Z org.apache.spark.deploy.yarn.SparkLaunchActor$$anonfun$2 [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver - Launched Spark Application with ID application_1479911564377_0015
2016-11-23T16:24:18.299Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Handling progress request for jobId 81
2016-11-23T16:24:18.300Z com.trifacta.jobserver.WebApi$$anonfun$2$$anonfun$apply$2 [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Got request to get progress info for entity application_1479911564377_0015
2016-11-23T16:24:23.354Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver.WebApi - Handling progress request for jobId 81
2016-11-23T16:24:23.355Z com.trifacta.jobserver.WebApi$$anonfun$2$$anonfun$apply$2 [SparkJobServer-akka.actor.default-dispatcher-7] INFO  com.trifacta.jobserver.WebApi - Got request to get progress info for entity application_1479911564377_0015
2016-11-23T16:24:28.398Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Handling progress request for jobId 81
2016-11-23T16:24:28.398Z com.trifacta.jobserver.WebApi$$anonfun$2$$anonfun$apply$2 [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Got request to get progress info for entity application_1479911564377_0015
2016-11-23T16:24:33.578Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Handling progress request for jobId 81
2016-11-23T16:24:33.579Z com.trifacta.jobserver.WebApi$$anonfun$2$$anonfun$apply$2 [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Got request to get progress info for entity application_1479911564377_0015
2016-11-23T16:24:38.621Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Handling progress request for jobId 81
2016-11-23T16:24:38.621Z com.trifacta.jobserver.WebApi$$anonfun$2$$anonfun$apply$2 [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.WebApi - Got request to get progress info for entity application_1479911564377_0015
2016-11-23T16:24:38.633Z com.trifacta.jobserver.ProgressUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.ProgressUtil - Final status of Application application_1479911564377_0015 = FAILED
2016-11-23T16:24:38.655Z com.trifacta.jobserver.ProgressUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.ProgressUtil - Application application_1479911564377_0015 failed or was killed. Error:
2016-11-23T16:24:38.655Z com.trifacta.jobserver.ProgressUtil$ [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.ProgressUtil - org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can be issued only with kerberos or web authentication
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken(FSNamesystem.java:7482)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDelegationToken(NameNodeRpcServer.java:543)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getDelegationToken(AuthorizationProviderProxyClientProtocol.java:662)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDelegationToken(ClientNamenodeProtocolServerSideTranslatorPB.java:966)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)

	at org.apache.hadoop.ipc.Client.call(Client.java:1472)
	at org.apache.hadoop.ipc.Client.call(Client.java:1409)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy9.getDelegationToken(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDelegationToken(ClientNamenodeProtocolTranslatorPB.java:914)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
	at com.sun.proxy.$Proxy10.getDelegationToken(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getDelegationToken(DFSClient.java:1081)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getDelegationToken(DistributedFileSystem.java:1452)
	at org.apache.hadoop.fs.FileSystem.collectDelegationTokens(FileSystem.java:541)
	at org.apache.hadoop.fs.FileSystem.addDelegationTokens(FileSystem.java:519)
	at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2181)
	at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:140)
	at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
	at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
	at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:206)
	at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
	at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:199)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
	at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1.apply(RDD.scala:1129)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.treeAggregate(RDD.scala:1127)
	at org.apache.spark.sql.execution.datasources.json.InferSchema$.infer(InferSchema.scala:65)
	at org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$4.apply(JSONRelation.scala:114)
	at org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$4.apply(JSONRelation.scala:109)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.execution.datasources.json.JSONRelation.dataSchema$lzycompute(JSONRelation.scala:109)
	at org.apache.spark.sql.execution.datasources.json.JSONRelation.dataSchema(JSONRelation.scala:108)
	at org.apache.spark.sql.sources.HadoopFsRelation.schema$lzycompute(interfaces.scala:636)
	at org.apache.spark.sql.sources.HadoopFsRelation.schema(interfaces.scala:635)
	at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:109)
	at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:244)
	at com.trifacta.jobserver.profiler.spark.Profiler.profile(Profiler.scala:163)
	at com.trifacta.jobserver.profiler.spark.Profiler.run(Profiler.scala:127)
	at com.trifacta.jobserver.profiler.spark.Profiler$.delayedEndpoint$com$trifacta$jobserver$profiler$spark$Profiler$1(Profiler.scala:67)
	at com.trifacta.jobserver.profiler.spark.Profiler$delayedInit$body.apply(Profiler.scala:33)
	at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
	at scala.App$class.main(App.scala:76)
	at com.trifacta.jobserver.profiler.spark.Profiler$.main(Profiler.scala:33)
	at com.trifacta.jobserver.profiler.spark.Profiler.main(Profiler.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:542)

2016-11-23T16:24:38.692Z com.trifacta.jobserver.logging.ServerLogging$class [SparkJobServer-akka.actor.default-dispatcher-8] INFO  com.trifacta.jobserver.SparkJobServer - Deleted error file /trifacta/tempfiles/application_1479911564377_0015.error
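
So the job server itself is logged in via Kerberos (auth:KERBEROS above) and the YARN application launches, but it then fails with "Delegation Token can be issued only with kerberos or web authentication" as soon as the profiler reads its input. As far as I can tell, that error means the process running on YARN is only token-authenticated and is asking the NameNode for a fresh HDFS delegation token that was not obtained and passed along at submit time. For comparison, this is a rough sketch of the principal/keytab settings that, as I understand it, Spark 1.6 on YARN normally uses so tokens can be obtained for the job (all values below are placeholders, not my actual configuration):

import org.apache.spark.SparkConf

// Rough sketch: the principal/keytab pair that spark-submit normally
// passes via --principal / --keytab on Spark 1.6, so YARN can obtain
// and renew HDFS delegation tokens for the application.
object SparkKerberosConfSketch {
  val sparkConf = new SparkConf()
    .setAppName("Trifacta Profiler")
    .set("spark.yarn.principal", "npdsvc.cloudera.trif@NONPROD.LOCAL") // service principal
    .set("spark.yarn.keytab", "/path/to/service.keytab")               // hypothetical path
    // Extra filesystems to obtain tokens for, if job paths use a
    // different HDFS authority than fs.defaultFS (placeholder value).
    .set("spark.yarn.access.namenodes", "hdfs://nameservice1")
}

I am not sure whether the Trifacta job server is supposed to pass something equivalent to the YARN client it launches, or whether the problem is elsewhere.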

Any suggestion is very welcome.