
Consolidated below are some of the errors thrown by the Spark Thrift Server during SQL execution, along with workarounds that involve configuring certain parameters in spark-thrift-sparkconf.conf and hive-site.xml.

Error 1:

Join condition is missing or trivial. Use the CROSS JOIN syntax to allow cartesian products between these relations.;

Resolution: set spark.sql.crossJoin.enabled=true
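As an alternative to changing the setting globally, the flag can be enabled for a single beeline session, or the query can be rewritten with an explicit CROSS JOIN, which Spark accepts without the flag. A minimal sketch (table names t1 and t2 are placeholders):

```sql
-- Option 1: enable cartesian products for this session only
SET spark.sql.crossJoin.enabled=true;

-- Option 2: state the cartesian product explicitly,
-- which does not require the flag
SELECT a.id, b.id
FROM t1 a CROSS JOIN t2 b;
```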

Error 2:

Caused by: org.codehaus.janino.JaninoRuntimeException: Code of method "eval(Lorg/apache/spark/sql/catalyst/InternalRow;)Z" of class "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificPredicate" grows beyond 64 KB

Resolution: set spark.sql.codegen.wholeStage=false
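In spark-thrift-sparkconf.conf this is a one-line change. Disabling whole-stage codegen makes Spark fall back to interpreted, operator-at-a-time evaluation instead of fusing a whole stage into one generated Java method, trading some execution speed for staying under the JVM's 64 KB method-size limit:

```
# spark-thrift-sparkconf.conf: avoid generating one large fused
# method per stage; sidesteps the 64 KB JVM method-size limit
spark.sql.codegen.wholeStage false
```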

Error 3:

java.lang.OutOfMemoryError: Java heap space

Resolution: raise spark.driver.memory to a higher value (for example, 10g), and/or lower spark.sql.ui.retainedExecutions to a smaller value (for example, 5) to reduce the memory the SQL UI retains on the driver for completed executions.
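A sketch of the corresponding spark-thrift-sparkconf.conf entries; 10g and 5 are starting points rather than tuned values, so size the driver memory to the RAM actually available on the Thrift Server host:

```
# Give the Thrift Server driver a larger heap
spark.driver.memory 10g

# Retain fewer completed SQL executions in the UI; these are
# held on the driver heap and accumulate over long uptimes
spark.sql.ui.retainedExecutions 5
```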

Error 4:

org.apache.spark.SparkException: Exception thrown in awaitResult: (state=,code=0)

Resolution: set the relevant property to false in hive-site.xml

To collect heap dumps from the Spark driver and executors when debugging OutOfMemoryError failures, add the following JVM options. The dump path must exist and be writable by the user running the Thrift Server, and executor dumps are written to the local filesystem of each worker node:

spark.driver.extraJavaOptions: '-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=<path-to-dump-file-location>'

spark.executor.extraJavaOptions: '-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=<path-to-dump-file-location>'
