18/01/26 10:00:33 INFO RSCDriver: Connecting to: hadoop-masternode3:10000
18/01/26 10:00:33 INFO RSCDriver: Starting RPC server...
18/01/26 10:00:33 INFO RpcServer: Connected to the port 10001
18/01/26 10:00:33 WARN RSCConf: Your hostname, hadoop-masternode3, resolves to a loopback address, but we couldn't find any external IP address!
18/01/26 10:00:33 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
18/01/26 10:00:34 INFO RSCDriver: Received job request 2c5221ee-f2f0-4fbf-9fed-7be70c4a0e77
18/01/26 10:00:34 INFO RSCDriver: SparkContext not yet up, queueing job request.
18/01/26 10:00:36 INFO SparkEntries: Starting Spark context...
18/01/26 10:00:36 INFO SparkContext: Running Spark version 2.2.0.2.6.3.0-235
18/01/26 10:00:37 INFO SparkContext: Submitted application: livy-session-16
18/01/26 10:00:37 INFO SecurityManager: Changing view acls to: livy,admin
18/01/26 10:00:37 INFO SecurityManager: Changing modify acls to: livy,admin
18/01/26 10:00:37 INFO SecurityManager: Changing view acls groups to:
18/01/26 10:00:37 INFO SecurityManager: Changing modify acls groups to:
18/01/26 10:00:37 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(livy, admin); groups with view permissions: Set(); users with modify permissions: Set(livy, admin); groups with modify permissions: Set()
18/01/26 10:00:37 INFO Utils: Successfully started service 'sparkDriver' on port 33521.
18/01/26 10:00:37 INFO SparkEnv: Registering MapOutputTracker
18/01/26 10:00:37 INFO SparkEnv: Registering BlockManagerMaster
18/01/26 10:00:37 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/01/26 10:00:37 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/01/26 10:00:37 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-463c0665-95bc-4952-94eb-3895a1a9562f
18/01/26 10:00:37 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
18/01/26 10:00:37 INFO SparkEnv: Registering OutputCommitCoordinator
18/01/26 10:00:37 INFO log: Logging initialized @4818ms
18/01/26 10:00:37 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/26 10:00:37 INFO Server: Started @4894ms
18/01/26 10:00:37 INFO AbstractConnector: Started ServerConnector@7311860f{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/01/26 10:00:37 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2500388f{/jobs,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@e44b3fa{/jobs/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@46b88943{/jobs/job,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@706570b5{/jobs/job/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4b1f2146{/stages,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@76e94b0{/stages/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6f6d209d{/stages/stage,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@679937c4{/stages/stage/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5277f0b1{/stages/pool,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@29081b11{/stages/pool/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3eaaf1b5{/storage,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@59c550f4{/storage/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@484ead3b{/storage/rdd,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6bf88f0b{/storage/rdd/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@717a092e{/environment,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@f3de347{/environment/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@60166519{/executors,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@357261c0{/executors/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@64016d6{/executors/threadDump,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@596959f4{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@66f3ea59{/static,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5c04b5d4{/,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4a918333{/api,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@69986538{/jobs/job/kill,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5bb60c4d{/stages/stage/kill,null,AVAILABLE,@Spark}
18/01/26 10:00:37 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.22:4040
18/01/26 10:00:37 INFO SparkContext: Added file file:/etc/spark2/2.6.3.0-235/0/hive-site.xml at spark://192.168.1.22:33521/files/hive-site.xml with timestamp 1516957237867
18/01/26 10:00:37 INFO Utils: Copying /etc/spark2/2.6.3.0-235/0/hive-site.xml to /tmp/spark-c5ae78c2-705a-4062-8f8a-97dd18138603/userFiles-01ea5699-8b1d-496a-b4d6-be8afa8bf709/hive-site.xml
18/01/26 10:00:38 INFO Utils: Using initial executors = 1, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
18/01/26 10:00:38 INFO RMProxy: Connecting to ResourceManager at hadoop-masternode1.bikw.sqlag/192.168.1.20:8050
18/01/26 10:00:38 ERROR SparkContext: Error initializing SparkContext.
18/01/26 10:00:38 INFO AbstractConnector: Stopped Spark@7311860f{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/01/26 10:00:38 INFO SparkUI: Stopped Spark web UI at http://192.168.1.22:4040
18/01/26 10:00:38 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
18/01/26 10:00:38 INFO YarnClientSchedulerBackend: Stopped
18/01/26 10:00:38 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/01/26 10:00:38 ERROR Utils: Uncaught exception in thread pool-2-thread-1
java.lang.NullPointerException
	at org.apache.spark.network.shuffle.ExternalShuffleClient.close(ExternalShuffleClient.java:140)
	at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1485)
	at org.apache.spark.SparkEnv.stop(SparkEnv.scala:90)
	at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1944)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1317)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1943)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:587)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
	at org.apache.spark.SparkContext.getOrCreate(SparkContext.scala)
	at org.apache.livy.rsc.driver.SparkEntries.sc(SparkEntries.java:51)
	at org.apache.livy.rsc.driver.SparkEntries.sparkSession(SparkEntries.java:72)
	at org.apache.livy.repl.AbstractSparkInterpreter.postStart(AbstractSparkInterpreter.scala:67)
	at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply$mcV$sp(SparkInterpreter.scala:92)
	at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:67)
	at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:67)
	at org.apache.livy.repl.AbstractSparkInterpreter.restoreContextClassLoader(AbstractSparkInterpreter.scala:308)
	at org.apache.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:67)
	at org.apache.livy.repl.Session$$anonfun$1.apply(Session.scala:127)
	at org.apache.livy.repl.Session$$anonfun$1.apply(Session.scala:121)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
18/01/26 10:00:38 INFO SparkContext: Successfully stopped SparkContext
18/01/26 10:00:38 WARN SparkEntries: SparkSession is not supported
ERROR: org.apache.hadoop.security.authorize.AuthorizationException: User: livy is not allowed to impersonate admin
18/01/26 10:00:38 INFO DiskBlockManager: Shutdown hook called
18/01/26 10:00:38 INFO ShutdownHookManager: Shutdown hook called
18/01/26 10:00:38 INFO ShutdownHookManager: Deleting directory /tmp/spark-c5ae78c2-705a-4062-8f8a-97dd18138603/userFiles-01ea5699-8b1d-496a-b4d6-be8afa8bf709
18/01/26 10:00:38 INFO ShutdownHookManager: Deleting directory /tmp/spark-c5ae78c2-705a-4062-8f8a-97dd18138603
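
The actual failure here is the final AuthorizationException: the Livy server (running as user livy) tried to submit to YARN on behalf of admin, but Hadoop's proxy-user rules do not allow livy to impersonate anyone; everything above it (the NullPointerException, the UI teardown) is fallout from the aborted SparkContext startup. The conventional fix is to declare livy as a proxy user in Hadoop's core-site.xml and restart HDFS/YARN. The sketch below assumes the service account is literally named livy and uses wildcard scoping for illustration; on a real cluster you would restrict the hosts and groups values:

```xml
<!-- core-site.xml (sketch): let the "livy" service user impersonate end users.
     "*" is illustrative only; narrow hosts/groups for production. -->
<property>
  <name>hadoop.proxyuser.livy.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.livy.groups</name>
  <value>*</value>
</property>
```

Impersonation also has to be enabled on the Livy side (livy.impersonation.enabled = true in livy.conf) for the session's proxy user to be applied at all. The loopback-address warnings near the top of the log are a separate, non-fatal issue; as the log itself suggests, they can be addressed by setting livy.rsc.rpc.server.address.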