Unable to instantiate SparkSession with Hive support because Hive classes are not found

New Contributor

Apache Spark is showing the error below when running jobs.


+++++++++++++++++++++++++++++++++++++++++++++++++++++++
Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
	at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:878)
	at br.com.recortv.dadosTratados_FaixaHoraria$.main(dadosTratados_FaixaHoraria.scala:39)
+++++++++++++++++++++++++++++++++++++++++++++++++++++++
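
For reference, the code around line 39 of dadosTratados_FaixaHoraria.scala is along these lines (a minimal sketch; the app name and the query are placeholders, not the actual job):

package br.com.recortv

import org.apache.spark.sql.SparkSession

object dadosTratados_FaixaHoraria {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() throws the IllegalArgumentException above when the
    // Hive classes (spark-hive and its dependencies) are not on the classpath.
    val spark = SparkSession.builder()
      .appName("dadosTratados_FaixaHoraria") // placeholder app name
      .enableHiveSupport()
      .getOrCreate()

    // Any Hive-backed work follows here; this query is just an example.
    spark.sql("SHOW DATABASES").show()
  }
}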

2 REPLIES

Cloudera Employee

Hi,

 

Are you trying to run spark-shell or pyspark on a node where the Spark Gateway is not installed, by any chance? The classes should come from a JAR file called spark-hive_<version>.jar on the host, in the directory

/opt/cloudera/parcels/CDH/jars 

Could you check if that exists?
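
If it is easier, a quick check from the driver itself can confirm whether the Hive classes are visible on the classpath. A minimal sketch, assuming the class names Spark 2.x looks for when enabling Hive support (adjust them for your version):

// Rough classpath check for the Hive classes Spark needs.
// The class names below are an assumption based on Spark 2.x; adjust as needed.
object HiveClasspathCheck {
  def main(args: Array[String]): Unit = {
    val candidates = Seq(
      "org.apache.spark.sql.hive.HiveSessionStateBuilder",
      "org.apache.hadoop.hive.conf.HiveConf"
    )
    candidates.foreach { name =>
      val found =
        try { Class.forName(name); true }
        catch { case _: ClassNotFoundException => false }
      println(s"$name -> ${if (found) "found" else "NOT found"}")
    }
  }
}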

Could you tell a bit more about what kind of jobs you are trying to run and from where?

 

Regards,

Zsombor

Super Collaborator

Hi @Camilo 

 

When you share an exception, please include more details; that helps us provide a solution faster.

 

1. How are you launching the Spark job?

2. If you built the application with Maven or sbt, have you added the spark-hive dependency with the right Scala and Spark version? For example,

<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.12</artifactId>
    <version>2.4.8</version>
    <scope>provided</scope>
</dependency>
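
If you are using sbt rather than Maven, the equivalent line in build.sbt would look roughly like this (the version here mirrors the Maven example and should match the Spark on your cluster):

// build.sbt -- rough sbt equivalent of the Maven dependency above
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.8" % "provided"

Note that with provided scope the jar is not bundled into your application, so it still has to be present on the cluster at runtime, which is why checking /opt/cloudera/parcels/CDH/jars as mentioned above matters.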

 

References:

 

1. https://stackoverflow.com/questions/39444493/how-to-create-sparksession-with-hive-support-fails-with...

2. https://mvnrepository.com/artifact/org.apache.spark/spark-hive