
java.lang.IllegalStateException(RPC channel is closed.)


Explorer

I have installed a Cloudera 5.7 cluster and changed a single parameter:

 

hive.execution.engine to Spark
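For reference, the equivalent per-session switch from Beeline is the standard Hive SET syntax (cluster-wide, the same property lives in hive-site.xml):

SET hive.execution.engine=spark;  -- per-session override of the same property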

 

Then I tried to execute an example query, which resulted in the following error:

 

16/04/29 03:14:52 ERROR status.SparkJobMonitor: Status: SENT
16/04/29 03:14:52 INFO log.PerfLogger: </PERFLOG method=SparkRunJob start=1461917631790 end=1461917692802 duration=61012 from=org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor>
16/04/29 03:14:52 ERROR exec.Task: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
java.lang.IllegalStateException: RPC channel is closed.
at com.google.common.base.Preconditions.checkState(Preconditions.java:145)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:276)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:259)
at org.apache.hive.spark.client.SparkClientImpl$ClientProtocol.cancel(SparkClientImpl.java:523)
at org.apache.hive.spark.client.SparkClientImpl.cancel(SparkClientImpl.java:187)
at org.apache.hive.spark.client.JobHandleImpl.cancel(JobHandleImpl.java:62)
at org.apache.hadoop.hive.ql.exec.spark.status.impl.RemoteSparkJobRef.cancelJob(RemoteSparkJobRef.java:54)
at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:119)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1774)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1531)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1311)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1120)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1113)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:178)
at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:245)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

16/04/29 03:14:52 ERROR ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
16/04/29 03:14:52 INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1461917618369 end=1461917692836 duration=74467 from=org.apache.hadoop.hive.ql.Driver>
16/04/29 03:14:52 INFO ql.Driver: Completed executing command(queryId=hive_20160429031313_ef0fd500-f203-4f36-a1db-49b7b3efaf71); Time taken: 74.467 seconds
16/04/29 03:14:52 INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/04/29 03:14:52 INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1461917692838 end=1461917692845 duration=7 from=org.apache.hadoop.hive.ql.Driver>
16/04/29 03:14:52 ERROR operation.Operation: Error running hive query:
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:374)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:180)
at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:245)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
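The engine change itself can be confirmed from the same Beeline session before running anything (SET with just a property name prints its current value; the 60s default for the monitor timeout is my reading of HiveConf and worth verifying on your release):

SET hive.execution.engine;           -- should print hive.execution.engine=spark
SET hive.spark.job.monitor.timeout;  -- the window behind the ~61s give-up visible in the SparkRunJob duration above; default 60s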

4 Replies

Re: java.lang.IllegalStateException(RPC channel is closed.) - second request - please help?

Explorer
Can someone give me a clue about the nature of the problem here with getting a simple Hive query to execute on Spark?

Do I have to extend the Spark job monitor timeout? If so, how do I do that? Is it the settings sketched after the log below?

Any suggestions out there?


16/05/05 10:15:00 INFO log.PerfLogger: <PERFLOG method=SparkSubmitToRunning from=org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor>
16/05/05 10:16:01 INFO status.SparkJobMonitor: Job hasn't been submitted after 61s. Aborting it.
16/05/05 10:16:01 ERROR status.SparkJobMonitor: Status: SENT
16/05/05 10:16:01 INFO log.PerfLogger: </PERFLOG method=SparkRunJob start=1462461300288 end=1462461361295 duration=61007 from=org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor>
16/05/05 10:16:01 ERROR exec.Task: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
java.lang.IllegalStateException: RPC channel is closed.
at com.google.common.base.Preconditions.checkState(Preconditions.java:145)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:276)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:259)
at org.apache.hive.spark.client.SparkClientImpl$ClientProtocol.cancel(SparkClientImpl.java:523)
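From digging through HiveConf, these look like the relevant knobs; is something like the following what I should be trying? (Property names are from HiveConf and the Hive-on-Spark docs; I have not confirmed the defaults on CDH 5.7, so treat this as a sketch.)

SET hive.spark.job.monitor.timeout=180s;              -- widen the window before "Job hasn't been submitted... Aborting it." (default appears to be 60s)
SET hive.spark.client.server.connect.timeout=300000;  -- ms; how long HiveServer2 waits for the remote Spark driver to connect back
SET hive.spark.client.connect.timeout=30000;          -- ms; client-side timeout on the same RPC channel

Reading the trace, the IllegalStateException itself looks like a follow-on error: the monitor gives up at 61s, then tries to cancel the job over an RPC channel that was never established, which is where "RPC channel is closed" is thrown.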

Re: java.lang.IllegalStateException(RPC channel is closed.)

New Contributor

I am getting this same error on CDH 5.7.


Re: java.lang.IllegalStateException(RPC channel is closed.)

New Contributor
I'm getting the same issue on CDH 5.8.

Re: java.lang.IllegalStateException(RPC channel is closed.)

Expert Contributor

Same for me on CDH 5.10 and the latest Livy. Everything else is OK as long as Hive is not involved.
