Member since: 09-01-2020
Posts: 2
Kudos Received: 0
Solutions: 0
01-29-2021 05:47 AM
@lerner as @schhabra1 says: you can use the public Hortonworks repo - https://repo.hortonworks.com/content/groups/public/
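For an sbt build, that repository can be added as a resolver, e.g. (a minimal sketch; the resolver name "Hortonworks Public" is arbitrary):

resolvers += "Hortonworks Public" at "https://repo.hortonworks.com/content/groups/public/"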
01-29-2021 05:25 AM
Hi, I am facing the same error described in this topic, but after changing the version of shc-core the error changed to the following:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.parser.AbstractSqlParser: method <init>()V not found
at org.apache.spark.sql.execution.SparkSqlParser.<init>(SparkSqlParser.scala:42)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser$lzycompute(BaseSessionStateBuilder.scala:117)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser(BaseSessionStateBuilder.scala:116)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:292)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1104)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:145)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:144)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:144)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:141)
at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:788)
at org.apache.spark.sql.SparkSession.read(SparkSession.scala:655)
at ContactEngineRun$.main(ContactEngineRun.scala:26)

The line ContactEngineRun.scala:26 referenced in the stack trace above is this code:

val hbaseDF = sparkSession.read
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .load()

These are my build dependencies:

val spark_version = "2.4.4"

libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-library" % "2.11.12",
  // SPARK LIBRARIES
  "org.apache.spark" %% "spark-streaming" % spark_version,
  "org.apache.spark" %% "spark-core" % spark_version,
  "org.apache.spark" %% "spark-sql" % spark_version,
  // HBASE
  "org.apache.hbase.connectors.spark" % "hbase-spark" % "1.0.0.7.2.1.0-321",
  "org.apache.hbase" % "hbase-client" % "2.4.0",
  "com.hortonworks.shc" % "shc-core" % "1.1.0.3.1.5.0-152"
)

Can someone help me?
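For reference, the catalog value passed to HBaseTableCatalog.tableCatalog is the usual SHC JSON schema string; a minimal sketch along these lines (the table, column family, and column names here are placeholders, not my real schema):

import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

// Placeholder SHC catalog: maps HBase table "contacts" (namespace "default")
// to a DataFrame with columns "id" (the row key) and "name" (column family "cf1").
val catalog: String =
  s"""{
     |  "table":{"namespace":"default", "name":"contacts"},
     |  "rowkey":"key",
     |  "columns":{
     |    "id":{"cf":"rowkey", "col":"key", "type":"string"},
     |    "name":{"cf":"cf1", "col":"name", "type":"string"}
     |  }
     |}""".stripMargin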