
Livy Job failing after upgrading cluster to HDP 2.6.3


Explorer

I recently upgraded my cluster to HDP 2.6.3, which brings Spark 2.2.0 and Livy 0.4.0. Since the upgrade, my Livy job fails every time with an RSC driver exception. I analysed the logs and found that the host is not able to add the JAR file from the given location, which is why the job is failing.

Error: WARN DefaultPromise: An exception was thrown by org.apache.livy.rsc.Utils$2.operationComplete()
java.util.concurrent.RejectedExecutionException: event executor terminated
  at io.netty.util.concurrent.SingleThreadEventExecutor.reject(SingleThreadEventExecutor.java:796)
  at io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(SingleThreadEventExecutor.java:336)
  at io.netty.util.concurrent.SingleThreadEventExecutor.addTask(SingleThreadEventExecutor.java:329)
  at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:739)
  at io.netty.util.concurrent.AbstractScheduledEventExecutor.schedule(AbstractScheduledEventExecutor.java:190)
  at io.netty.util.concurrent.AbstractScheduledEventExecutor.schedule(AbstractScheduledEventExecutor.java:134)
  at io.netty.util.concurrent.AbstractEventExecutorGroup.schedule(AbstractEventExecutorGroup.java:49)
  at org.apache.livy.rsc.driver.RSCDriver.setupIdleTimeout(RSCDriver.java:238)
  at org.apache.livy.rsc.driver.RSCDriver.access$100(RSCDriver.java:70)
  at org.apache.livy.rsc.driver.RSCDriver$2.onSuccess(RSCDriver.java:220)
  at org.apache.livy.rsc.driver.RSCDriver$2.onSuccess(RSCDriver.java:216)
  at org.apache.livy.rsc.Utils$2.operationComplete(Utils.java:108)
  at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
  at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
  at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
  at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:104)
  at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
  at io.netty.channel.AbstractChannel$CloseFuture.setClosed(AbstractChannel.java:1004)
  at io.netty.channel.AbstractChannel$AbstractUnsafe.doClose0(AbstractChannel.java:633)
  at io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:611)
  at io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:554)
  at io.netty.channel.DefaultChannelPipeline$HeadContext.close(DefaultChannelPipeline.java:1236)
  at io.netty.channel.AbstractChannelHandlerContext.invokeClose(AbstractChannelHandlerContext.java:619)
  at io.netty.channel.AbstractChannelHandlerContext.close(AbstractChannelHandlerContext.java:603)
  at io.netty.channel.ChannelDuplexHandler.close(ChannelDuplexHandler.java:73)
  at io.netty.channel.AbstractChannelHandlerContext.invokeClose(AbstractChannelHandlerContext.java:619)
  at io.netty.channel.AbstractChannelHandlerContext.close(AbstractChannelHandlerContext.java:603)
  at io.netty.channel.AbstractChannelHandlerContext.close(AbstractChannelHandlerContext.java:460)
  at io.netty.channel.DefaultChannelPipeline.close(DefaultChannelPipeline.java:949)
  at io.netty.channel.AbstractChannel.close(AbstractChannel.java:194)
  at org.apache.livy.rsc.rpc.Rpc.close(Rpc.java:307)
  at org.apache.livy.rsc.driver.RSCDriver.shutdownServer(RSCDriver.java:309)
  at org.apache.livy.rsc.driver.RSCDriver.shutdown(RSCDriver.java:133)
  at org.apache.livy.rsc.driver.RSCDriver.handle(RSCDriver.java:396)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at org.apache.livy.rsc.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:130)
  at org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:77)
  at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
  at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
  at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
  at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
  at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
  at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
  at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
  at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
  at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
  at java.lang.Thread.run(Thread.java:745)

13 REPLIES

Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Explorer

This is really urgent; any help would be highly appreciated!


Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Can you share the details of how the Livy job is launched? This exception appears to come from the Livy side. Did the Spark job actually get launched, and what errors appear on the Spark executor side?


Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Explorer

@vshukla

The Livy job is launched by passing call() to an application. On the earlier HDP version (2.6.2, with Livy 0.3.0 and Spark 2.1.1) it worked fine; since I upgraded to HDP 2.6.3 it has been throwing this exception. My guess is that the SparkContext is not being created by Livy. Below are the YARN logs; no logs were created on the Spark side.

11:47:06 INFO YarnRMClient: Registering the ApplicationMaster
11:47:06 INFO RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
11:47:06 INFO RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
11:47:06 INFO YarnAllocator: Will request 5 executor container(s), each with 5 core(s) and 1 MB memory (including 00 MB of overhead)
11:47:06 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@
11:47:06 INFO YarnAllocator: Submitted 15 unlocalized container requests.
11:47:06 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 3000, initial allocation : 200) intervals
11:47:07 INFO AMRMClientImpl: Received new token for :
11:47:07 INFO YarnAllocator: Launching container container_e60_151002 on host for executor with ID 1
11:47:07 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
11:47:07 INFO YarnAllocator: Received 1 containers from YARN, launching executors on 1 of them.
11:47:07 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
11:47:07 INFO ContainerManagementProtocolProxy: Opening proxy : 454
11:47:10 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor)
11:47:10 INFO BlockManagerMasterEndpoint: Registering block manager with 2.8 GB RAM, BlockManagerId
11:47:11 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
11:47:11 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done
11:47:11 INFO SparkEntries: Spark context finished initialization in 6347ms
11:47:11 INFO SparkEntries: Created Spark session (with Hive support).
11:47:17 INFO SparkEntries: Created HiveContext.
11:47:17 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
11:47:19 INFO SparkContext: Added file /tmp/__livy__/mdm.jar at spark://:488/files/mdm.jar with timestamp 151
11:47:19 INFO Utils: Copying /tmp/__livy__/mdm.jar to /hadoop/yarn/local/usercache/livy/appcache/application_15/spark-ae212c0/userFiles-516f/mdm.jar
11:47:21 INFO SparkContext: Added JAR hdfs://8020/tmp/simple-project/mdm.jar at hdfs://:8020/tmp/simple-project/mdm.jar with timestamp 151
11:47:21 INFO RSCDriver: Received bypass job request f340a3ad
11:47:21 INFO AbstractConnector: Stopped Spark@6f6{HTTP/1.1,[http/]}{0.0.0.0:0}
11:47:21 INFO SparkUI: Stopped Spark web UI at http://
11:47:21 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
11:47:21 INFO YarnClusterSchedulerBackend: Shutting down all executors
11:47:21 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
11:47:21 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices (serviceOption=None, services=List(), started=false)
11:47:21 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
11:47:21 INFO MemoryStore: MemoryStore cleared
11:47:21 INFO BlockManager: BlockManager stopped
11:47:21 INFO BlockManagerMaster: BlockManagerMaster stopped
11:47:21 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
11:47:21 INFO SparkContext: Successfully stopped SparkContext
11:47:21 INFO PythonInterpreter: Shutting down process
11:47:22 INFO SparkContext: SparkContext already stopped.
11:47:23 INFO PythonInterpreter: process has been shut down
11:47:23 INFO SparkContext: SparkContext already stopped.
11:47:23 WARN DefaultPromise: An exception was thrown by org.apache.livy.rsc.Utils$2.operationComplete()
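For anyone trying to isolate the problem outside the application, the same kind of session (with a JAR attached) can also be created through Livy's REST API via POST /sessions. The sketch below only builds the request body; the session kind and JAR path are illustrative placeholders modelled on the logs above, not the actual cluster values:

```python
import json

# Hedged sketch: construct the JSON body for Livy's POST /sessions endpoint.
# "kind" and "jars" are standard fields of the create-session request;
# the HDFS path is a placeholder, not a real value from this cluster.
def build_session_request(kind, jars=None):
    if kind is None:
        # Livy requires a session kind (e.g. spark, pyspark, sparkr),
        # so fail fast on the client side if it is missing.
        raise ValueError("session kind must not be null")
    body = {"kind": kind}
    if jars:
        body["jars"] = list(jars)
    return json.dumps(body)

payload = build_session_request("spark", jars=["hdfs:///tmp/simple-project/mdm.jar"])
print(payload)
```

Posting that body to the Livy server (and then polling GET /sessions/{id}) shows whether session creation itself fails, independently of the application code.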


Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Explorer

@vshukla Can you please provide your inputs on this?


Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Can you turn up the Livy log level and post the logs from the Livy server?
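For reference, raising the Livy server's log level is usually done by editing its log4j configuration and restarting the server. A minimal sketch, assuming the stock log4j.properties layout Livy ships with (the exact file path can differ per install):

```properties
# e.g. /usr/hdp/current/livy2-server/conf/log4j.properties (path is illustrative)
log4j.rootCategory=DEBUG, console
# Narrower alternative: raise only Livy's own loggers
log4j.logger.org.apache.livy=DEBUG
```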

Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Explorer

@vshukla

Below are the Livy server logs:

application_1 INFO InteractiveSessionManager: Registering new session 85
application_1 INFO LineBufferedStream: stdout: Warning: Master yarn-cluster is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
application_1 INFO LineBufferedStream: stdout: application_151 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
application_1 INFO LineBufferedStream: stdout: application_1 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
application_1 INFO LineBufferedStream: stdout: application_1 INFO RMProxy: Connecting to ResourceManager at XXXX:8050
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Requesting a new application from cluster with 25 NodeManagers
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (10752 MB per container)
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Will allocate AM container, with 3456 MB memory including 384 MB overhead
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Setting up container launch context for our AM
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Setting up the launch environment for our AM container
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Preparing resources for our AM container
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Use hdfs cache file as spark.yarn.archive for HDP, hdfsCacheFile:hdfs://XXXX/spark2/spark2-hdp-yarn-archive.tar.gz
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Source and destination file systems are the same. Not copying hdfs://XXXX/spark2/spark2-hdp-yarn-archive.tar.gz
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/livy2-server/rsc-jars/netty-all-4.0.29.Final.jar -> hdfs://XXXXXX/user/livy/.sparkStaging/application_1/netty-all-4.0.29.Final.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/livy2-server/rsc-jars/livy-rsc-0.4.0.2.6.3.0-235.jar -> XXXX/user/livy/.sparkStaging/application_1/livy-rsc-0.4.0.2.6.3.0-235.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/livy2-server/rsc-jars/livy-api-0.4.0.2.6.3.0-235.jar -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/livy-api-0.4.0.2.6.3.0-235.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/livy2-server/repl_2.11-jars/livy-core_2.11-0.4.0.2.6.3.0-235.jar -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/livy-core_2.11-0.4.0.2.6.3.0-235.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/livy2-server/repl_2.11-jars/livy-repl_2.11-0.4.0.2.6.3.0-235.jar -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/livy-repl_2.11-0.4.0.2.6.3.0-235.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/livy2-server/repl_2.11-jars/commons-codec-1.9.jar -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/commons-codec-1.9.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/spark2-client/jars/datanucleus-api-jdo-3.2.6.jar -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/datanucleus-api-jdo-3.2.6.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/spark2-client/jars/datanucleus-rdbms-3.2.9.jar -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/datanucleus-rdbms-3.2.9.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/spark2-client/jars/datanucleus-core-3.2.10.jar -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/datanucleus-core-3.2.10.jar
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/etc/spark2/2.6.3.0-235/0/hive-site.xml -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/hive-site.xml
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/spark2-client/R/lib/sparkr.zip#sparkr -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/sparkr.zip
application_1 INFO LineBufferedStream: stdout: application_1 INFO Client: Uploading resource file:/usr/hdp/current/spark2-client/python/lib/pyspark.zip -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/pyspark.zip
INFO LineBufferedStream: stdout: INFO Client: Uploading resource file:/usr/hdp/current/spark2-client/python/lib/py4j-0.10.4-src.zip -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/py4j-0.10.4-src.zip
INFO LineBufferedStream: stdout: INFO Client: Uploading resource file:/tmp/spark-6d8338e8-578d-4d77-8bc5-55090776bc38/__spark_conf__563125507133746340.zip -> hdfs://XXXXXXXXX/user/livy/.sparkStaging/application_1/__spark_conf__.zip
INFO LineBufferedStream: stdout: INFO SecurityManager: Changing view acls to: livy
INFO LineBufferedStream: stdout: INFO SecurityManager: Changing modify acls to: livy
INFO LineBufferedStream: stdout: INFO SecurityManager: Changing view acls groups to:
INFO LineBufferedStream: stdout: INFO SecurityManager: Changing modify acls groups to:
INFO LineBufferedStream: stdout: INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(livy); groups with view permissions: Set(); users with modify permissions: Set(livy); groups with modify permissions: Set()
INFO LineBufferedStream: stdout: INFO Client: Submitting application application_1 to ResourceManager
INFO LineBufferedStream: stdout: INFO YarnClientImpl: Submitted application application_1
INFO LineBufferedStream: stdout: INFO Client: Application report for application_1 (state: ACCEPTED)
INFO LineBufferedStream: stdout: INFO Client:
INFO LineBufferedStream: stdout: client token: N/A
INFO LineBufferedStream: stdout: diagnostics: Scheduler has assigned a container for AM, waiting for AM container to be launched
INFO LineBufferedStream: stdout: ApplicationMaster host: N/A
INFO LineBufferedStream: stdout: ApplicationMaster RPC port: -1
INFO LineBufferedStream: stdout: queue: default
INFO LineBufferedStream: stdout: start time: 1512031435221
INFO LineBufferedStream: stdout: final status: UNDEFINED
INFO LineBufferedStream: stdout: tracking URL: http://XXXX
INFO LineBufferedStream: stdout: user: livy
INFO LineBufferedStream: stdout: INFO ShutdownHookManager: Shutdown hook called
INFO LineBufferedStream: stdout: INFO ShutdownHookManager: Deleting directory /tmp/spark-6d8338e8-578d-4d77-8bc5-55090776bc38
16:44:18 INFO RSCClient: Received result for cedfhhdk
16:44:18 INFO InteractiveSession: Interactive session 85 created [appid: application_1, owner: null, proxyUser: None, state: idle, kind: shared, info: {driverLogUrl=http://XXXXXXX/}]
16:44:24 INFO RSCClient: Received result for 271sjas
16:44:24 WARN RpcDispatcher: Received error message:java.lang.IllegalArgumentException: Invalid kind: null
  org.apache.livy.sessions.Kind$.apply(Kind.scala:48)
  org.apache.livy.repl.ReplDriver.createWrapper(ReplDriver.scala:91)
  org.apache.livy.rsc.driver.RSCDriver.handle(RSCDriver.java:409)
  sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  java.lang.reflect.Method.invoke(Method.java:498)
  org.apache.livy.rsc.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:130)
  org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:77)
  io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
  io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
  io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
  io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
  io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
  io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
  io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
  io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
  io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
  java.lang.Thread.run(Thread.java:745).
16:44:24 WARN RpcDispatcher: Received error message:java.util.NoSuchElementException: 24b22129-5d23-4e5c-ae05-8f4ee9c9b59c
  org.apache.livy.rsc.driver.RSCDriver.handle(RSCDriver.java:454)
  sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  java.lang.reflect.Method.invoke(Method.java:498)
  org.apache.livy.rsc.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:130)
  org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:77)
  io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
  io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
  io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
  io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
  io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
  io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
  io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
  io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
  io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
  java.lang.Thread.run(Thread.java:745).
16:44:24 ERROR SessionServlet$: internal error
java.util.concurrent.ExecutionException: org.apache.livy.rsc.rpc.RpcException: java.util.NoSuchElementException: 24b22129-5d23-4e5c-ae05-8f4ee9c9b59c
  org.apache.livy.rsc.driver.RSCDriver.handle(RSCDriver.java:454)
  sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  java.lang.reflect.Method.invoke(Method.java:498)
  org.apache.livy.rsc.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:130)
  org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:77)
  io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
  io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
  io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
  io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
  io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
  io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
  io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
  io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
  io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
  java.lang.Thread.run(Thread.java:745)
  at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
  at org.apache.livy.server.interactive.InteractiveSession.jobStatus(InteractiveSession.scala:550)
  at org.apache.livy.server.interactive.InteractiveSessionServlet$anonfun$22$anonfun$apply$19.apply(InteractiveSessionServlet.scala:228)
  at org.apache.livy.server.interactive.InteractiveSessionServlet$anonfun$22$anonfun$apply$19.apply(InteractiveSessionServlet.scala:226)
  at org.apache.livy.server.interactive.SessionHeartbeatNotifier$anonfun$withViewAccessSession$1.apply(SessionHeartbeat.scala:69)
  at org.apache.livy.server.interactive.SessionHeartbeatNotifier$anonfun$withViewAccessSession$1.apply(SessionHeartbeat.scala:67)
  at org.apache.livy.server.SessionServlet.doWithSession(SessionServlet.scala:221)
  at org.apache.livy.server.SessionServlet.withViewAccessSession(SessionServlet.scala:205)
  at org.apache.livy.server.interactive.InteractiveSessionServlet.org$apache$livy$server$interactive$SessionHeartbeatNotifier$super$withViewAccessSession(InteractiveSessionServlet.scala:40)
  at org.apache.livy.server.interactive.SessionHeartbeatNotifier$class.withViewAccessSession(SessionHeartbeat.scala:67)
  at org.apache.livy.server.interactive.InteractiveSessionServlet.withViewAccessSession(InteractiveSessionServlet.scala:40)
  at org.apache.livy.server.interactive.InteractiveSessionServlet$anonfun$22.apply(InteractiveSessionServlet.scala:226)
  at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$liftAction(ScalatraBase.scala:270)
  at org.scalatra.ScalatraBase$anonfun$invoke$1.apply(ScalatraBase.scala:265)
  at org.scalatra.ScalatraBase$anonfun$invoke$1.apply(ScalatraBase.scala:265)
  at org.scalatra.ApiFormats$class.withRouteMultiParams(ApiFormats.scala:178)
  at org.apache.livy.server.JsonServlet.withRouteMultiParams(JsonServlet.scala:39)
  at org.scalatra.ScalatraBase$class.invoke(ScalatraBase.scala:264)
  at org.scalatra.ScalatraServlet.invoke(ScalatraServlet.scala:49)
  at org.scalatra.ScalatraBase$anonfun$runRoutes$1$anonfun$apply$8.apply(ScalatraBase.scala:240)
  at org.scalatra.ScalatraBase$anonfun$runRoutes$1$anonfun$apply$8.apply(ScalatraBase.scala:238)
  at scala.Option.flatMap(Option.scala:170)
  at org.scalatra.ScalatraBase$anonfun$runRoutes$1.apply(ScalatraBase.scala:238)
  at org.scalatra.ScalatraBase$anonfun$runRoutes$1.apply(ScalatraBase.scala:237)
  at scala.collection.immutable.Stream.flatMap(Stream.scala:442)
  at org.scalatra.ScalatraBase$class.runRoutes(ScalatraBase.scala:237)
  at org.scalatra.ScalatraServlet.runRoutes(ScalatraServlet.scala:49)
  at org.scalatra.ScalatraBase$class.runActions$1(ScalatraBase.scala:163)
  at org.scalatra.ScalatraBase$anonfun$executeRoutes$1.apply$mcV$sp(ScalatraBase.scala:175)
  at org.scalatra.ScalatraBase$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
  at org.scalatra.ScalatraBase$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
  at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$cradleHalt(ScalatraBase.scala:193)
  at org.scalatra.ScalatraBase$class.executeRoutes(ScalatraBase.scala:175)
  at org.scalatra.ScalatraServlet.executeRoutes(ScalatraServlet.scala:49)
  at org.scalatra.ScalatraBase$anonfun$handle$1.apply$mcV$sp(ScalatraBase.scala:113)
  at org.scalatra.ScalatraBase$anonfun$handle$1.apply(ScalatraBase.scala:113)
  at org.scalatra.ScalatraBase$anonfun$handle$1.apply(ScalatraBase.scala:113)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
  at org.scalatra.DynamicScope$class.withResponse(DynamicScope.scala:80)
  at org.scalatra.ScalatraServlet.withResponse(ScalatraServlet.scala:49)
  at org.scalatra.DynamicScope$anonfun$withRequestResponse$1.apply(DynamicScope.scala:60)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
  at org.scalatra.DynamicScope$class.withRequest(DynamicScope.scala:71)
  at org.scalatra.ScalatraServlet.withRequest(ScalatraServlet.scala:49)
  at org.scalatra.DynamicScope$class.withRequestResponse(DynamicScope.scala:59)
  at org.scalatra.ScalatraServlet.withRequestResponse(ScalatraServlet.scala:49)
  at org.scalatra.ScalatraBase$class.handle(ScalatraBase.scala:111)
  at org.scalatra.ScalatraServlet.org$scalatra$servlet$ServletBase$super$handle(ScalatraServlet.scala:49)
  at org.scalatra.servlet.ServletBase$class.handle(ServletBase.scala:43)
  at org.apache.livy.server.SessionServlet.org$scalatra$MethodOverride$super$handle(SessionServlet.scala:39)
  at org.scalatra.MethodOverride$class.handle(MethodOverride.scala:28)
  at org.apache.livy.server.SessionServlet.org$scalatra$GZipSupport$super$handle(SessionServlet.scala:39)
  at org.scalatra.GZipSupport$anonfun$handle$1.apply$mcV$sp(GZipSupport.scala:34)
  at org.scalatra.GZipSupport$anonfun$handle$1.apply(GZipSupport.scala:19)
  at org.scalatra.GZipSupport$anonfun$handle$1.apply(GZipSupport.scala:19)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
  at org.scalatra.DynamicScope$class.withResponse(DynamicScope.scala:80)
  at org.scalatra.ScalatraServlet.withResponse(ScalatraServlet.scala:49)
  at org.scalatra.DynamicScope$anonfun$withRequestResponse$1.apply(DynamicScope.scala:60)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
  at org.scalatra.DynamicScope$class.withRequest(DynamicScope.scala:71)
  at org.scalatra.ScalatraServlet.withRequest(ScalatraServlet.scala:49)
  at org.scalatra.DynamicScope$class.withRequestResponse(DynamicScope.scala:59)
  at org.scalatra.ScalatraServlet.withRequestResponse(ScalatraServlet.scala:49)
  at org.scalatra.GZipSupport$class.handle(GZipSupport.scala:18)
  at org.apache.livy.server.interactive.InteractiveSessionServlet.org$scalatra$servlet$FileUploadSupport$super$handle(InteractiveSessionServlet.scala:40)
  at org.scalatra.servlet.FileUploadSupport$class.handle(FileUploadSupport.scala:93)
  at org.apache.livy.server.interactive.InteractiveSessionServlet.handle(InteractiveSessionServlet.scala:40)
  at org.scalatra.ScalatraServlet.service(ScalatraServlet.scala:54)
  at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
  at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1669)
  at org.apache.livy.server.CsrfFilter.doFilter(CsrfFilter.scala:42)
  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
  at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
  at org.eclipse.jetty.server.Server.handle(Server.java:499)
  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
  at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
  at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
  at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.livy.rsc.rpc.RpcException: java.util.NoSuchElementException: 24b22129-5d23-4e5c-ae05-8f4ee9c9b59c
  org.apache.livy.rsc.driver.RSCDriver.handle(RSCDriver.java:454)
  sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  java.lang.reflect.Method.invoke(Method.java:498)
  org.apache.livy.rsc.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:130)
  org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:77)
  io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
  io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
  io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
  io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293) io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267) io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103) io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294) io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911) io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643) io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566) io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480) io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442) io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131) java.lang.Thread.run(Thread.java:745) at org.apache.livy.rsc.rpc.RpcDispatcher.handleError(RpcDispatcher.java:155) at org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:83) at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) at 
io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244) at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846) at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) ... 1 more INFO InteractiveSession: Stopping InteractiveSession 85... WARN RpcDispatcher: [ClientProtocol] Closing RPC channel with 1 outstanding RPCs. INFO InteractiveSession: Stopped InteractiveSession 85.

All the warnings have been taken care of, but I am still getting this error.
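For anyone debugging a similar failure: the Livy REST API exposes each session's driver log (`GET /sessions/{id}/log`), which usually shows the exact jar path the RSC driver failed to add. A minimal sketch below — the host name and session id are placeholders, not values from this cluster:

```python
import json
import urllib.request

LIVY_URL = "http://livy-host:8999"  # placeholder; use your Livy server address
SESSION_ID = 85                     # placeholder; the session id from the Livy log

# Build a GET request for the last 100 lines of the session's driver log.
req = urllib.request.Request(
    f"{LIVY_URL}/sessions/{SESSION_ID}/log?from=0&size=100",
    headers={"Content-Type": "application/json"},
)

# Uncomment to actually fetch and pretty-print the log against a live server:
# with urllib.request.urlopen(req) as resp:
#     print(json.dumps(json.load(resp), indent=2))
print(req.full_url)
```

The `from`/`size` query parameters page through the log, so you can walk back to the first "add jar" failure rather than only seeing the final shutdown messages.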


Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Explorer

@vshukla any input?


Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Expert Contributor

I've got the same problem, also after upgrading to HDP 2.6.3!


Re: Livy Job failing after upgrading cluster to HDP 2.6.3

Super Guru

Did you reboot? Did you do the upgrade with Ambari? Livy and Spark should both have been updated. Make sure you submit jobs to Livy2. I would uninstall the existing Livy, reboot, and install Livy2.
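Building on that advice, when resubmitting through the Livy2 endpoint it can also help to reference jars by HDFS path instead of a local filesystem path, since Livy only serves local files from directories whitelisted via `livy.file.local-dir-whitelist`. A hedged sketch of a batch-submission payload — the host, jar paths, and class name are placeholders:

```python
import json
import urllib.request

LIVY2_URL = "http://livy-host:8999/batches"  # placeholder Livy2 batches endpoint

# Reference the application jar and its dependencies by HDFS path so the
# RSC driver does not need access to a whitelisted local directory.
payload = {
    "file": "hdfs:///apps/myapp/myapp.jar",      # placeholder application jar
    "className": "com.example.MyApp",            # placeholder main class
    "jars": ["hdfs:///apps/myapp/lib/dep.jar"],  # placeholder dependency jar
    "conf": {"spark.master": "yarn"},
}

# POST the JSON payload to create the batch session.
req = urllib.request.Request(
    LIVY2_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment to submit against a live Livy2 server:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

If the jars must stay on the local filesystem, adding their directory to `livy.file.local-dir-whitelist` in the Livy2 config (and restarting Livy2) is the usual alternative.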
