
Using Spark Hbase Connector on CDH 6.3.2 Spark 2.4 HBase 2.1

New Contributor

Using the Spark HBase Connector.

Cloudera distribution: 6.3.2

HBase version: 2.1.0

Scala Version: 2.11.12

Error:

With spark-hbase connector version 1.1.1-2.1-s_2.11, it fails with the error below:

java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.parse(Lorg/json4s/JsonInput;Z)Lorg/json4s/JsonAST$JValue;
at org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog$.apply(HBaseTableCatalog.scala:257)

 

Searching the internet suggests that shc-core:1.1.3-2.4-s_2.11.jar solves the issue, but I could not find a repository to download it from.

Any suggestions, please?

 

Thanks.

 

6 Replies

Expert Contributor

You can use the public Hortonworks repo: https://repo.hortonworks.com/content/groups/public/

You may not find the exact version you mentioned, but you can browse the repo and use the dependencies that match your cluster version.

You can try the dependency below:

 

<dependency>
  <groupId>com.hortonworks.shc</groupId>
  <artifactId>shc-core</artifactId>
  <version>1.1.0.3.1.5.0-152</version>
</dependency>

 

Let me know if it works; it should be compatible.
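If you are building with Maven, note that the repository itself also has to be declared in the POM (or in settings.xml) before the artifact above can be resolved; a minimal sketch, with an arbitrary repository id:

```xml
<repositories>
  <repository>
    <id>hortonworks-public</id>
    <url>https://repo.hortonworks.com/content/groups/public/</url>
  </repository>
</repositories>
```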

 

New Contributor

Hi, I am facing the same error as in this topic, but after changing the version of shc-core the error changed to this:

 

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.parser.AbstractSqlParser: method <init>()V not found
at org.apache.spark.sql.execution.SparkSqlParser.<init>(SparkSqlParser.scala:42)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser$lzycompute(BaseSessionStateBuilder.scala:117)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser(BaseSessionStateBuilder.scala:116)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:292)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1104)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:145)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:144)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:144)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:141)
at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:788)
at org.apache.spark.sql.SparkSession.read(SparkSession.scala:655)
at ContactEngineRun$.main(ContactEngineRun.scala:26) <- this frame corresponds to the code below

 

val hbaseDF = sparkSession.read
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .load()
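For reference, the `catalog` value passed above has to be a JSON string in the shc catalog format, mapping DataFrame columns to HBase column families; a minimal sketch with hypothetical table, family, and column names:

```scala
// Hypothetical catalog: maps HBase table "contacts" (family "cf") to a 3-column DataFrame.
val catalog =
  s"""{
     |  "table":  {"namespace": "default", "name": "contacts"},
     |  "rowkey": "key",
     |  "columns": {
     |    "id":    {"cf": "rowkey", "col": "key",   "type": "string"},
     |    "name":  {"cf": "cf",     "col": "name",  "type": "string"},
     |    "email": {"cf": "cf",     "col": "email", "type": "string"}
     |  }
     |}""".stripMargin
```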

 

These are my build dependencies (build.sbt):

 

val spark_version = "2.4.4"

libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-library" % "2.11.12",
  // Spark libraries
  "org.apache.spark" %% "spark-streaming" % spark_version,
  "org.apache.spark" %% "spark-core" % spark_version,
  "org.apache.spark" %% "spark-sql" % spark_version,
  // HBase
  "org.apache.hbase.connectors.spark" % "hbase-spark" % "1.0.0.7.2.1.0-321",
  "org.apache.hbase" % "hbase-client" % "2.4.0",
  "com.hortonworks.shc" % "shc-core" % "1.1.0.3.1.5.0-152"
)


 Can someone help me?

Explorer

@SibDe Sorry to disturb you. Are you using Cloudera Manager 6.3.x? Could you please help me verify an issue I've encountered? FYI, I upgraded Cloudera Manager from 5.10.0 to 6.3.0, and the following API no longer seems to work well.

Can you help verify whether 'wget http://<cm_server_host>:7180/api/v13/clusters/<cluster_name>/services/<service_name>/clientConfig' works as expected? Thanks in advance!

 

 

New Contributor

Were you able to find an alternative, since the jar doesn't seem to be available in the Maven repository yet?

While trying to import it through --packages, it throws the error below:

 

Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.hortonworks#shc;1.1.3-2.4-s_2.11: not found]
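For what it's worth, spark-submit resolves --packages against Maven Central by default, and the shc artifacts live in the Hortonworks repo mentioned earlier in this thread. Pointing spark-submit at that repo with --repositories, and using the shc-core artifactId rather than shc, is a sketch worth trying; the version shown is the one from this thread, so adjust it to whatever the repo actually hosts:

```shell
# Illustrative only: the version, application class, and jar name are placeholders.
spark-submit \
  --repositories https://repo.hortonworks.com/content/groups/public/ \
  --packages com.hortonworks:shc-core:1.1.1-2.1-s_2.11 \
  --class MyApp \
  myapp.jar
```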

New Contributor

@lerner as @schhabra1 says:

You can use public hortonworks repo - https://repo.hortonworks.com/content/groups/public/

New Contributor

Sorry for disturbing. I am working on a toy project which needs to insert a Spark DataFrame into HBase.

Apache Kafka: 2.2.1
Apache Spark: 2.4.0
Apache HBase: 2.1.4
CDH: 6.3.2

After reading some posts about Spark HBase connectors, I decided to use the Hortonworks Spark HBase connector.

I am wondering whether I need the HBase client configuration file hbase-site.xml for the Hortonworks Spark HBase connector when working in a CDH environment?

 

Thanks for your help in advance!
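For context, a hedged note rather than a definitive answer: the connector picks up HBase connection settings (ZooKeeper quorum, etc.) from an hbase-site.xml on the application's classpath. On a CDH node with the HBase Gateway role deployed, /etc/hbase/conf is typically on the classpath already; otherwise a common pattern is to ship the file explicitly with the job (paths, class name, and jar name below are illustrative):

```shell
# Ship the HBase client config alongside the job; MyHBaseJob and myjob.jar are hypothetical.
spark-submit \
  --files /etc/hbase/conf/hbase-site.xml \
  --class MyHBaseJob \
  myjob.jar
```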