
Hive on Spark engine

Hi All,

I am trying to run Hive on the Spark engine. Cluster details: HDP 2.6 / Ambari 2.5 / Hive 1.2.1 / Spark 1.6.

I have copied the Spark assembly jar file to Hive's /lib/ directory and set hive.execution.engine=spark, then tried to run a query, which threw the error below.

hive> set hive.execution.engine=spark;
hive> select count(*) from sparksql_query.final_result;
Query ID = user_20180906105558_d4566904-e2a4-4d94-b91b-404bdd6c1b12
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Spark Job = 61362f50-0bd4-4676-bfef-87d433386a94
Status: SENT
Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
hive>
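
For clarity, the copy step in shell form (the HDP client paths are my guess at the typical locations; adjust for your install):

# Copy the Spark assembly jar into Hive's lib directory
# (paths assume a standard HDP 2.6 layout; adjust as needed)
cp /usr/hdp/current/spark-client/lib/spark-assembly-*.jar /usr/hdp/current/hive-client/lib/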

Any suggestion will help me a lot.

Kant


Re: Hive on Spark engine

@Kant T

Hive on Spark is not supported on HDP, hence the error; HDP's Hive runs on the MapReduce or Tez execution engines. You may try SparkSQL if you want to leverage Spark.
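
For example, here is a minimal sketch of the same count through SparkSQL, assuming Spark 1.6 with Hive support (HiveContext) and hive-site.xml on the Spark classpath so it can reach your metastore; the object and app names are arbitrary:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object CountFinalResult {
  def main(args: Array[String]): Unit = {
    // Standard Spark 1.6 setup; HiveContext picks up hive-site.xml
    // from the classpath to locate the Hive metastore.
    val sc = new SparkContext(new SparkConf().setAppName("CountFinalResult"))
    val hiveContext = new HiveContext(sc)

    // The same count your Hive query attempted, now executed by Spark.
    hiveContext.sql("SELECT COUNT(*) FROM sparksql_query.final_result").show()

    sc.stop()
  }
}

Alternatively, the spark-sql shell bundled with Spark can run the statement directly: spark-sql -e "select count(*) from sparksql_query.final_result"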