Created 07-13-2022 11:52 AM
Apache Spark is showing the error below when running jobs.
+++++++++++++++++++++++++++++++++++++++++++++++++++++++
Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:878)
at br.com.recortv.dadosTratados_FaixaHoraria$.main(dadosTratados_FaixaHoraria.scala:39)
+++++++++++++++++++++++++++++++++++++++++++++++++++++++
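For context, this exception is thrown at startup by a SparkSession builder that requests Hive support. Below is a minimal sketch of the kind of code involved; the package and object names are taken from the stack trace above, and the body is a reconstruction, not the actual application source:

package br.com.recortv

import org.apache.spark.sql.SparkSession

object dadosTratados_FaixaHoraria {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() throws IllegalArgumentException immediately
    // when the spark-hive classes are missing from the classpath.
    val spark = SparkSession.builder()
      .appName("dadosTratados_FaixaHoraria")
      .enableHiveSupport()
      .getOrCreate()
    // ... job logic would follow here ...
  }
}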
Created 07-19-2022 01:49 PM
Hi,
are you trying to run spark-shell or pyspark on a node where the Spark Gateway is not installed, by any chance? The classes should come from a JAR file called spark-hive_<version>.jar on the host, in the directory
/opt/cloudera/parcels/CDH/jars
Could you check if that exists?
Could you tell us a bit more about what kind of jobs you are trying to run, and from where?
Regards,
Zsombor
Created 08-31-2022 08:59 PM
Hi @Camilo
When you share an exception, please include more details; that will help us provide a solution faster.
1. How are you launching the Spark job?
2. If you built the application with a build tool such as Maven or sbt, did you specify the spark-hive dependency and its version? For example, with Maven (an sbt equivalent is sketched after this snippet):
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.12</artifactId>
    <version>2.4.8</version>
    <scope>provided</scope>
</dependency>
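If the project uses sbt instead of Maven, the equivalent dependency would look like this (a sketch assuming Scala 2.12 and Spark 2.4.8, matching the Maven snippet above):

// build.sbt: pull in the Hive integration classes at compile time only
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.8" % "provided"

Note that with provided scope the Hive classes are not bundled into your application jar, so they must be supplied by the cluster at runtime, for example via the CDH parcel jars mentioned earlier in this thread.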
References:
1. https://mvnrepository.com/artifact/org.apache.spark/spark-hive