
Hive query on spark as execution engine in HDP 2.6.5

Expert Contributor

Hi,

We are getting the following error while executing a Hive query with Spark as the execution engine.

Hive version: 1.2.1, Spark version: 1.6

 

set hive.execution.engine=spark;
set spark.home=/usr/hdp/current/spark-client;
set spark.master=yarn-client;
set spark.eventLog.enabled=true;
set spark.executor.memory=512m;
set spark.executor.cores=2;
set spark.driver.extraClassPath=/usr/hdp/current/hive-client/lib/hive-exec.jar;
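As a quick sanity check before submitting the query, the engine actually in effect can be echoed back; this is a small sketch using standard Hive `set` syntax (entering a property name with no value prints its current setting):

```sql
-- Print the value currently in effect for the execution engine.
-- "set <property>;" with no "=value" echoes the current setting.
set hive.execution.engine;
```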

 

Query ID = svchdpir2d_20191106105445_a9ebc8a2-9c28-4a3d-ac5e-0a8609e56fd5
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
Starting Spark Job = c6cc1641-20ad-4073-ab62-4f621ae595c8
Status: SENT
Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html#release for an explanation.

 

Could you please help with this?

Thank you

1 ACCEPTED SOLUTION

Expert Contributor

@sampathkumar_ma - In HDP, Hive's execution engine supports only MapReduce and Tez; Hive on Spark is not currently supported in HDP.
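Given that answer, a sketch of the supported path is to switch the session back to Tez (the default engine in HDP 2.6.x) instead of Spark. This assumes Tez is installed, which is standard in an HDP 2.6.5 cluster:

```sql
-- Supported in HDP: run the query on Tez instead of Spark.
set hive.execution.engine=tez;
-- MapReduce is the other supported option: set hive.execution.engine=mr;
```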


2 REPLIES

Expert Contributor

Can you please assist with this? Thanks
