Support Questions


java.lang.IllegalStateException(RPC channel is closed.)

Explorer

I have installed a Cloudera 5.7 cluster and changed one single parameter:


hive.execution.engine to Spark
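(For reference, the session-level equivalent in Beeline would be something like the lines below; in my case the property was changed through the cluster's Hive configuration, so this is only to show what I mean:)

-- switch the execution engine for the current session (illustrative only)
set hive.execution.engine=spark;
-- print the current value to confirm it took effect
set hive.execution.engine;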


Then I tried to execute an example query, which resulted in the following error:


16/04/29 03:14:52 ERROR status.SparkJobMonitor: Status: SENT
16/04/29 03:14:52 INFO log.PerfLogger: </PERFLOG method=SparkRunJob start=1461917631790 end=1461917692802 duration=61012 from=org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor>
16/04/29 03:14:52 ERROR exec.Task: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
java.lang.IllegalStateException: RPC channel is closed.
at com.google.common.base.Preconditions.checkState(Preconditions.java:145)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:276)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:259)
at org.apache.hive.spark.client.SparkClientImpl$ClientProtocol.cancel(SparkClientImpl.java:523)
at org.apache.hive.spark.client.SparkClientImpl.cancel(SparkClientImpl.java:187)
at org.apache.hive.spark.client.JobHandleImpl.cancel(JobHandleImpl.java:62)
at org.apache.hadoop.hive.ql.exec.spark.status.impl.RemoteSparkJobRef.cancelJob(RemoteSparkJobRef.java:54)
at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:119)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1774)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1531)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1311)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1120)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1113)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:178)
at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:245)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

16/04/29 03:14:52 ERROR exec.Task: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
java.lang.IllegalStateException: RPC channel is closed.
at com.google.common.base.Preconditions.checkState(Preconditions.java:145)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:276)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:259)
at org.apache.hive.spark.client.SparkClientImpl$ClientProtocol.cancel(SparkClientImpl.java:523)
at org.apache.hive.spark.client.SparkClientImpl.cancel(SparkClientImpl.java:187)
at org.apache.hive.spark.client.JobHandleImpl.cancel(JobHandleImpl.java:62)
at org.apache.hadoop.hive.ql.exec.spark.status.impl.RemoteSparkJobRef.cancelJob(RemoteSparkJobRef.java:54)
at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:119)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1774)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1531)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1311)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1120)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1113)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:178)
at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:245)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
16/04/29 03:14:52 ERROR ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
16/04/29 03:14:52 INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1461917618369 end=1461917692836 duration=74467 from=org.apache.hadoop.hive.ql.Driver>
16/04/29 03:14:52 INFO ql.Driver: Completed executing command(queryId=hive_20160429031313_ef0fd500-f203-4f36-a1db-49b7b3efaf71); Time taken: 74.467 seconds
16/04/29 03:14:52 INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/04/29 03:14:52 INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1461917692838 end=1461917692845 duration=7 from=org.apache.hadoop.hive.ql.Driver>
16/04/29 03:14:52 ERROR operation.Operation: Error running hive query:
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:374)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:180)
at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:245)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

4 REPLIES

Explorer
Can someone give me a clue about the nature of the problem here with getting a simple Hive query to execute on Spark?

Do I have to extend the Spark Job Monitor timeout? If so, how do I do that? (My best guess at the relevant settings is sketched after the log below.)

Any suggestions out there?


16/05/05 10:15:00 INFO log.PerfLogger: <PERFLOG method=SparkSubmitToRunning from=org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor>
16/05/05 10:16:01 INFO status.SparkJobMonitor: Job hasn't been submitted after 61s. Aborting it.
16/05/05 10:16:01 ERROR status.SparkJobMonitor: Status: SENT
16/05/05 10:16:01 INFO log.PerfLogger: </PERFLOG method=SparkRunJob start=1462461300288 end=1462461361295 duration=61007 from=org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor>
16/05/05 10:16:01 ERROR exec.Task: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
java.lang.IllegalStateException: RPC channel is closed.
at com.google.common.base.Preconditions.checkState(Preconditions.java:145)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:276)
at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:259)
at org.apache.hive.spark.client.SparkClientImpl$ClientProtocol.cancel(SparkClientImpl.java:523)
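From the "Job hasn't been submitted after 61s. Aborting it." line above, my guess is that the job monitor timeout (and maybe the client connect timeouts) could be raised along the lines below, though I have not verified that this actually fixes the underlying RPC failure rather than just waiting longer for it:

-- assumption: these Hive-on-Spark timeout properties are what the monitor uses;
-- the values are arbitrary examples, and on CDH they may need to be set in the
-- Hive service configuration (safety valve) rather than per session
set hive.spark.job.monitor.timeout=120s;
set hive.spark.client.connect.timeout=30000ms;
set hive.spark.client.server.connect.timeout=300000ms;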


New Contributor

I am getting the same error on CDH 5.7.

Explorer
I'm getting the same issue on CDH 5.8.

Expert Contributor

Same for me on CDH 5.10 and the latest Livy. Everything else is OK as long as Hive is not involved.