10-23-2017 05:19 AM
We are getting the following error while executing Hive queries with the Spark execution engine.
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
The following properties are set to use Spark as the execution engine instead of MapReduce:
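For context, a typical set of session properties for enabling Hive on Spark looks roughly like this (per the Hive-on-Spark getting-started documentation; the values below are illustrative, not necessarily the exact ones we used):

    set hive.execution.engine=spark;
    set spark.master=yarn;
    set spark.eventLog.enabled=true;
    set spark.executor.memory=4g;
    set spark.executor.cores=2;
    set spark.serializer=org.apache.spark.serializer.KryoSerializer;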
I tried changing the following properties as well.
11-02-2017 04:55 AM
The error was a configuration issue: the container requested for each Spark executor (the executor memory plus its overhead) was larger than what YARN was allowed to allocate, so the Spark client could not be created. We need to either lower the executor memory (spark.executor.memory) and executor memory overhead (spark.yarn.executor.memoryOverhead), or increase the maximum memory allocation (yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb).
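As a quick sanity check (the numbers here are illustrative assumptions, not our cluster's actual values), the container YARN must grant for each executor is the executor memory plus the overhead, and both YARN limits have to cover it:

    spark.executor.memory              = 4g   (4096 MB)
    spark.yarn.executor.memoryOverhead = 512  (MB)
    required container size            = 4096 + 512 = 4608 MB

    The request succeeds only if:
    yarn.scheduler.maximum-allocation-mb >= 4608
    yarn.nodemanager.resource.memory-mb  >= 4608

If the request exceeds yarn.scheduler.maximum-allocation-mb, YARN refuses the container, which Hive then surfaces as the "Failed to create spark client" error above.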
We can refer to this link for guidance on tuning these values.
We tried various combinations, and the following properties gave the best results on our cluster:
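As an illustration of what such a combination can look like (the values are assumptions for the sake of example, not the actual tuned numbers), the Spark side can be set per session or in hive-site.xml:

    set spark.executor.memory=2g;
    set spark.yarn.executor.memoryOverhead=512;

while the YARN limits live in yarn-site.xml and need a ResourceManager/NodeManager restart to take effect:

    <property>
      <name>yarn.scheduler.maximum-allocation-mb</name>
      <value>8192</value>
    </property>
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>8192</value>
    </property>

With these numbers the executor request (2048 + 512 = 2560 MB) fits comfortably under the 8192 MB ceiling, so YARN can allocate the container and the Spark client starts.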