Error: Error while compiling statement: FAILED: SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client. (state=42000,code=40000)
Some of my Hive jobs submitted through Oozie fail with this error. The Hive editor is in use throughout the day, and most of our jobs are scheduled through Oozie around the clock, so we cannot afford to have any of them fail.
What can I do to avoid this error?
Here's my cluster configuration:
What can I do to allow any number of concurrent connections to Spark?
YARN containers are allowed 6 GB of memory, and that has worked fine with MapReduce.
Spark Executor Cores : 4
Spark Executor Maximum Java Heap Size: 2 GB
Spark Driver Memory Overhead: 26 MiB
Spark Executor Memory Overhead: 26 MiB
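If it helps, this is how I understand the equivalent session-level settings would look (property names as I understand them from the Hive-on-Spark documentation; the values simply mirror my current configuration above, so please correct me if the names are wrong):

```sql
-- Session-level equivalents of the values above, set via beeline / Hive CLI.
-- Property names are my best understanding of Hive-on-Spark configuration.
set spark.executor.cores=4;
set spark.executor.memory=2g;                -- executor Java heap
set spark.yarn.executor.memoryOverhead=26;   -- in MiB
set spark.yarn.driver.memoryOverhead=26;     -- in MiB
```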
My node configurations:
1 master node running the Spark server: 16 vCPU, 64 GB memory
3 worker nodes running HDFS and YARN: 16 vCPU, 64 GB memory each
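To sanity-check my own reasoning before asking, here is a rough sizing sketch I put together. The per-node reservation (1 core and 8 GB for the OS and Hadoop daemons) and the ~10% memory-overhead rule of thumb are my assumptions, not cluster facts:

```python
# Rough Spark-on-YARN executor sizing for 3 worker nodes, 16 vCPU / 64 GB each.
# The reservation amounts and overhead ratio below are assumptions for illustration.

def size_executors(nodes=3, vcpus_per_node=16, mem_gb_per_node=64,
                   cores_per_executor=5, reserved_cores=1, reserved_mem_gb=8):
    """Reserve resources for OS/daemons, then split the rest among executors."""
    usable_cores = vcpus_per_node - reserved_cores           # 15 cores
    usable_mem = mem_gb_per_node - reserved_mem_gb           # 56 GB
    executors_per_node = usable_cores // cores_per_executor  # 3 executors/node
    mem_per_executor = usable_mem // executors_per_node      # 18 GB container
    # Rule of thumb: roughly 10% of executor memory goes to off-heap overhead.
    overhead = max(int(mem_per_executor * 0.10), 1)
    heap = mem_per_executor - overhead
    return {
        "total_executors": nodes * executors_per_node,
        "cores_per_executor": cores_per_executor,
        "heap_gb": heap,
        "overhead_gb": overhead,
    }

print(size_executors())
# -> {'total_executors': 9, 'cores_per_executor': 5, 'heap_gb': 17, 'overhead_gb': 1}
```

That would give 9 executors of 17 GB heap + 1 GB overhead each, which is quite different from my guess below, so I would appreciate a sanity check.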
What should be the values for above-mentioned parameters?
My guess is 6 executors with a 25 GB heap and 7 GB of executor memory overhead, but I am not confident in these numbers.
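Checking the arithmetic behind that guess (just my own back-of-the-envelope math):

```python
# Sanity check of the guess: 6 executors, 25 GB heap, 7 GB overhead each,
# spread across 3 worker nodes with 64 GB of memory each.
nodes, mem_per_node_gb = 3, 64
executors, heap_gb, overhead_gb = 6, 25, 7

per_executor = heap_gb + overhead_gb             # 32 GB per YARN container
per_node = (executors // nodes) * per_executor   # 2 executors/node -> 64 GB

print(per_executor, per_node)  # -> 32 64
```

So the guess would consume the full 64 GB on every worker, leaving nothing for the OS and Hadoop daemons, which is why I suspect it is wrong.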