How to use HBase with Spark 2?

I'm updating the dependency versions of a Scala project and I'm running into problems with HBase. The versions I updated to are listed below (a sketch of how they are declared follows the list):

  • scala-library -> 2.11.12
  • scalatest_2.11 -> 3.0.8
  • spark-streaming_2.11 -> 2.4.3
  • spark-sql_2.11 -> 2.4.3
  • spark-hive_2.11 -> 2.4.3
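For reference, this is roughly how the dependencies are declared (a minimal sketch assuming sbt; the real project may use Maven, and the hbase-spark coordinate and version are placeholders, since that is exactly the artifact I can't get working):

    // build.sbt (minimal sketch, assuming sbt)
    // Scala/Spark/ScalaTest versions match the list above; the hbase-spark
    // group/version below is a placeholder for whatever artifact actually
    // works against Spark 2.4.
    scalaVersion := "2.11.12"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming" % "2.4.3",
      "org.apache.spark" %% "spark-sql"       % "2.4.3",
      "org.apache.spark" %% "spark-hive"      % "2.4.3",
      "org.scalatest"    %% "scalatest"       % "3.0.8" % Test,
      "org.apache.hbase" %  "hbase-spark"     % "???"   // placeholder version
    )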

Since then, I can't find any hbase-spark artifact that works with these versions; I always get the same compile error:

Error: Symbol 'type org.apache.spark.Logging' is missing from the classpath. This symbol is required by 'class org.apache.hadoop.hbase.spark.HBaseContext'. Make sure that type Logging is in your classpath and check for conflicting dependencies with -Ylog-classpath.

A full rebuild may help if 'HBaseContext.class' was compiled against an incompatible version of org.apache.spark.

    val hBaseContext = new HBaseContext(sc, hBaseConf)
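
For completeness, here is roughly the surrounding code (a minimal sketch; the SparkContext setup and the ZooKeeper quorum value are placeholders, not my real configuration):

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.spark.HBaseContext
    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder Spark setup; the real app builds sc elsewhere.
    val sparkConf = new SparkConf().setAppName("hbase-spark-test")
    val sc = new SparkContext(sparkConf)

    // Standard HBase client configuration; the quorum value is a placeholder.
    val hBaseConf = HBaseConfiguration.create()
    hBaseConf.set("hbase.zookeeper.quorum", "zk-host:2181")

    // This constructor call is where the 'org.apache.spark.Logging' error appears.
    val hBaseContext = new HBaseContext(sc, hBaseConf)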
