Support Questions

Find answers, ask questions, and share your expertise

Spark SQL Error connecting Hive Metastore


Hi All,

I'm getting the following error when trying to connect to Hive from my local Windows 10 machine. I have configured Hadoop and Spark locally, and the job runs fine when reading HDFS files; the problem occurs only with Hive. Please help me out with this.

Exception in thread "main" java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
    at org.apache.spark.sql.hive.HiveUtils$.hiveClientConfigurations(HiveUtils.scala:200)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:265)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1059)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:137)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:136)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:136)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:133)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:632)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:691)
    at MyProgram$.main(MyProgram.scala:54)
    at MyProgram.main(MyProgram.scala)

Process finished with exit code 1
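For context, the failure point at MyProgram.scala:54 is an ordinary Hive-backed query. A minimal sketch of the kind of program that triggers this error (the names and query are assumptions, not taken from the original post):

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical reproduction sketch. The NoSuchFieldError is thrown
// lazily, when the Hive metastore client is first instantiated —
// i.e. on the first spark.sql(...) call, not at session creation.
object MyProgram {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HiveMetastoreTest")
      .master("local[*]")
      .enableHiveSupport()   // requires spark-hive on the classpath
      .getOrCreate()

    spark.sql("SHOW DATABASES").show()  // fails here if Hive jars clash
    spark.stop()
  }
}
```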

1 ACCEPTED SOLUTION


The issue was resolved by adding an SBT dependency to my project matching the hive-metastore version found in the hive/lib directory.
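For anyone hitting the same error, the fix can be sketched as a build.sbt fragment. The versions below are assumptions: check the jars actually present in your own hive/lib directory and pin the dependencies to match, since the error typically comes from mixing Spark's built-in Hive client classes with a different Hive version on the classpath.

```scala
// build.sbt — hypothetical sketch; replace the versions with the ones
// matching the jars in your cluster's hive/lib directory.
val sparkVersion = "2.2.1"   // assumed Spark version

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion,
  // spark-hive pulls in the Hive client classes Spark was built against;
  // a mismatched Hive version on the classpath is what produces
  // NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
  "org.apache.spark" %% "spark-hive" % sparkVersion
)
```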


3 REPLIES 3

@Manikandan Jeyabal

What are your Spark and Hive versions? If you have Hive 2.x and a Spark version below 2.2, this is a known issue that was fixed in Spark 2.2.

Here is the Jira link.
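If upgrading Spark is not an option, Spark can also be pointed at the Hive client jars for your metastore version at runtime via its built-in configuration properties. A sketch, where the metastore version and jar path are assumptions to be replaced with your own:

```
# spark-defaults.conf — sketch; version and path are assumptions
spark.sql.hive.metastore.version  2.1.0
spark.sql.hive.metastore.jars     /opt/hive/lib/*
```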


@Rahul Soni

I'm using Spark 2.2.1 and Hive 2.4.2.129-1, and I'm still getting this issue.


The issue was resolved by adding an SBT dependency to my project matching the hive-metastore version found in the hive/lib directory.