java.lang.NoClassDefFoundError: org/antlr/runtime/tree/CommonTree
Created on ‎11-16-2020 11:31 PM - edited ‎09-16-2022 07:39 AM
In CDH 6.x, when running Hive SQL with the Spark execution engine, I sometimes hit the error below; this does not happen in CDH 5.x:
scheduler.TaskSetManager: Lost task 0.1 in stage 22.0 (TID 37, node03, executor 1): UnknownReason
util.Utils: uncaught exception in thread task-result-getter-1
java.lang.NoClassDefFoundError: org/antlr/runtime/tree/CommonTree
at java.lang.ClassLoader.defineClass1(Native Method)
If I switch to the MR execution engine, the error goes away.
This seems related to how classes are loaded from antlr-runtime-xxx.jar and antlr4-runtime-xxx.jar under /opt/cloudera/parcels/CDH/lib/hive/lib.
Created ‎11-22-2020 05:55 PM
Just an add-on:
The underlying class that causes the problem, org/antlr/runtime/tree/CommonTree, cannot be found in antlr4-runtime-xxx.jar, but it is present in antlr-runtime-xxx.jar, as the screenshot below shows:
So we copied antlr-runtime-xxx.jar from Hive's standard lib directory into Spark's standard jar directory, and that seems to have resolved our issue.
Created ‎01-03-2021 05:51 AM
Hi,
I am facing the same issue. How did you solve it?
Created ‎01-04-2021 02:43 AM
We copied antlr-runtime-xxx.jar from Hive's standard lib directory into Spark's standard jar directory.
Created ‎11-22-2020 08:55 PM
Hi, are you using any custom JARs? If yes, I think you need to configure the aux path on the Hive side to overcome this.
Created on ‎11-23-2020 02:06 AM - edited ‎11-23-2020 02:17 AM
Nope, I didn't use any custom/auxiliary JARs.
I am not sure exactly how JARs are loaded when Hive uses the Spark execution engine, but I do notice that the classpath is tailored by /opt/cloudera/parcels/CDH/lib/hive/bin/hive, as shown below, to add Spark-related JARs. This has nothing to do with antlr-runtime-xx.jar or antlr4-runtime-xx.jar, so I am confused why this happens for Hive on Spark but not for Hive on MR:
# add Spark jars to the classpath
if [[ -n "$SPARK_HOME" ]]
then
CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/spark-core*.jar
CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/spark-unsafe*.jar
CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/scala-library*.jar
fi
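To see concretely that the launcher logic above cannot pull in the ANTLR classes, here is a runnable emulation of it against a scratch SPARK_HOME: only the spark-core, spark-unsafe, and scala-library globs are appended, so no antlr entry ever lands on the classpath. The jar names and the starting CLASSPATH value are made-up stand-ins.

```shell
# Emulate the launcher snippet above against a scratch SPARK_HOME to show
# that only Spark/Scala jars are appended, never an antlr jar.
SPARK_HOME=$(mktemp -d)
mkdir -p "$SPARK_HOME/jars"
: > "$SPARK_HOME/jars/spark-core_2.11-2.4.0.jar"   # stand-in versions
: > "$SPARK_HOME/jars/spark-unsafe_2.11-2.4.0.jar"
: > "$SPARK_HOME/jars/scala-library-2.11.12.jar"

CLASSPATH=/opt/hive/conf                           # arbitrary starting value
if [ -n "$SPARK_HOME" ]; then
  CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/spark-core*.jar
  CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/spark-unsafe*.jar
  CLASSPATH=${CLASSPATH}:${SPARK_HOME}/jars/scala-library*.jar
fi

case "$CLASSPATH" in
  *antlr*) echo "antlr is on the classpath" ;;
  *)       echo "no antlr entry on the classpath" ;;
esac
```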
Created ‎06-22-2022 01:03 AM
So I need to know how to configure the aux path. Can you explain it in more detail, or share some steps or a screenshot? Thanks.
Created ‎06-22-2022 01:56 AM
- Hive aux JARs path
- In CDP Private Cloud Base, click Cloudera Manager > Clusters and select the HIVE service. Click Configuration and search for Hive Auxiliary JARs Directory.
- Specify a directory value for the Hive Aux JARs property if necessary, or make a note of the path.
- Upload the JAR to the specified directory on all Hive Metastore instances.
- Click Cloudera Manager > Clusters and select the HIVE-ON-TEZ service. Click Configuration and search for Hive Auxiliary JARs Directory.
- Upload the JAR to the specified directory on all HiveServer instances.
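The steps above boil down to: put the jar in one directory on every Hive host and point the service at that directory. A minimal sketch of the manual side, using scratch paths (the real aux directory and jar version are whatever you chose in the steps above; the mapping to hive.aux.jars.path is my understanding of what the CM property controls):

```shell
# Sketch of the manual side of the CM procedure above, using scratch paths.
work=$(mktemp -d)
aux_dir="$work/hive-aux"             # stand-in for your chosen aux directory
mkdir -p "$aux_dir"
: > "$work/antlr-runtime-3.5.2.jar"  # stand-in jar; the version is hypothetical
# Repeat this copy on every Hive Metastore / HiveServer host:
cp "$work/antlr-runtime-3.5.2.jar" "$aux_dir"/
echo "aux jars: $(ls "$aux_dir")"
# Then set the CM property "Hive Auxiliary JARs Directory" (which corresponds
# to Hive's hive.aux.jars.path setting) to the directory and restart Hive.
```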
