
java.lang.NoClassDefFoundError: org/antlr/runtime/tree/CommonTree

Explorer

In CDH 6.x, when running Hive SQL with the Spark execution engine, I sometimes encounter the error below; this doesn't happen in CDH 5.x:

 

scheduler.TaskSetManager: Lost task 0.1 in stage 22.0 (TID 37, node03, executor 1): UnknownReason
util.Utils: uncaught exception in thread task-result-getter-1
java.lang.NoClassDefFoundError: org/antlr/runtime/tree/CommonTree
    at java.lang.ClassLoader.defineClass1(Native Method)

 

If I switch to the MR execution engine, the error goes away.

This seems to be related to how classes are loaded from antlr-runtime-xxx.jar and antlr4-runtime-xxx.jar under /opt/cloudera/parcels/CDH/lib/hive/lib.

keep striving!
7 REPLIES

Explorer

Just an add-on:

 

The underlying class that caused the problem, org/antlr/runtime/tree/CommonTree, can't be found in antlr4-runtime-xxx.jar, but is present in antlr-runtime-xxx.jar, as the screenshot below shows.

So we copied antlr-runtime-xxx.jar from Hive's standard lib directory into Spark's standard jars directory, and that seems to have resolved our issue.

[screenshot: michaelli_0-1606096274993.png]
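The same check can be scripted instead of inspected in an archive viewer. Below is a minimal sketch; the parcel path and jar version patterns are assumptions about a standard CDH layout, and a jar is just a zip archive, so `unzip -l` can list its entries:

```shell
# find_class JAR CLASSFILE: print the jar name if it contains the class file.
# Works because a jar is a zip archive; assumes `unzip` is installed.
find_class() {
  if unzip -l "$1" 2>/dev/null | grep -q "$2"; then
    echo "found in $1"
  fi
}

# Hypothetical usage on a CDH node; adjust the parcel path and versions:
for j in /opt/cloudera/parcels/CDH/lib/hive/lib/antlr*-runtime-*.jar; do
  find_class "$j" 'org/antlr/runtime/tree/CommonTree.class'
done
```

On our cluster this prints only the antlr-runtime (ANTLR 3) jar, not the antlr4-runtime jar, matching the screenshot above.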

 


New Contributor

Hi,

 

I am facing the same issue. How did you solve it?

 

 

Guru

The fix was to copy antlr-runtime-xxx.jar from Hive's standard lib directory into Spark's standard jars directory.

Master Collaborator

Hi, are you using any custom JARs? If yes, I think you need to configure the aux path on the Hive side to overcome this.

Explorer

Nope, I didn't use any custom/auxiliary JARs.

 

I am not very sure how JARs are loaded when using the Spark execution engine for Hive, but I do notice that the classpath is adjusted by /opt/cloudera/parcels/CDH/lib/hive/bin/hive, as shown below, to add Spark-related JARs. This has nothing to do with antlr-runtime-xx.jar or antlr4-runtime-xx.jar, so I am confused why this happens for Hive on Spark but not for Hive on MR.
 

# add Spark jars to the classpath
if [[ -n "$SPARK_HOME" ]]
then
  CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/spark-core*.jar
  CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/spark-unsafe*.jar
  CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/scala-library*.jar
fi
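Before copying anything, it's worth confirming that Spark's jar directory really lacks an ANTLR 3 runtime. A small sketch of that check (the parcel path passed at the bottom is an assumption about a standard CDH layout; adjust it to your cluster):

```shell
# check_antlr3 DIR: report whether an ANTLR 3 runtime jar is present in DIR.
# An unmatched glob makes `ls` fail, which we use as the "missing" signal.
check_antlr3() {
  if ls "$1"/antlr-runtime-*.jar >/dev/null 2>&1; then
    echo "ANTLR 3 runtime present"
  else
    echo "no ANTLR 3 runtime in $1"
  fi
}

# Hypothetical parcel path; adjust to your cluster:
check_antlr3 /opt/cloudera/parcels/CDH/lib/spark/jars
```

If this reports the jar missing on the Spark side while the Hive lib directory has it, that matches the NoClassDefFoundError seen only under the Spark engine.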


New Contributor

So I need to know how to configure the aux path. Can you explain it in more detail, or share some steps or a screenshot? Thanks.

Guru

@Orcs 

 

  • Hive aux JARs path
    1. In CDP Private Cloud Base, click Cloudera Manager > Clusters and select the Hive service. Click Configuration and search for Hive Auxiliary JARs Directory.
    2. Specify a directory value for the Hive Auxiliary JARs Directory property if necessary, or make a note of the existing path.
    3. Upload the JAR to the specified directory on all Hive metastore instances.
    4. Click Cloudera Manager > Clusters and select the Hive-on-Tez service. Click Configuration and search for Hive Auxiliary JARs Directory.
    5. Upload the JAR to the specified directory on all HiveServer instances.
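For reference, outside Cloudera Manager the equivalent setting is Hive's `hive.aux.jars.path` property in hive-site.xml. A sketch with a purely hypothetical directory and jar version:

```xml
<!-- hive-site.xml: hypothetical path and version, shown for illustration only.
     The value can be a comma-separated list of jars or directories. -->
<property>
  <name>hive.aux.jars.path</name>
  <value>/usr/local/hive/auxlib/antlr-runtime-3.5.2.jar</value>
</property>
```

In managed CDH/CDP clusters, though, prefer the Cloudera Manager steps above so the setting survives redeployments of client configuration.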