Created 10-11-2016 08:20 PM
What protocol is used by the new Spark-HBase connector? Is Spark going through the HBase Thrift server?
Created 10-11-2016 08:43 PM
Integration between Spark and HBase relies on HBaseContext, which distributes the HBase client configuration to the Spark executors. So, to answer the question: the connector uses the standard HBase RPC protocol (the same protocol the Java client uses), not Thrift. Please check the following link for more details.
https://hbase.apache.org/book.html#spark
And here is the GitHub link to the HBase Spark module:
https://github.com/apache/hbase/tree/master/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark
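To make the setup concrete, here is a minimal sketch of wiring Spark to HBase with HBaseContext, following the hbase-spark module linked above. The app name and configuration are placeholders; running it requires a live HBase cluster, with hbase-site.xml on the classpath.

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.spark.SparkContext

val sc = new SparkContext("local[*]", "hbase-context-example")

// Standard HBase client configuration; on the executors, reads and writes
// go over the regular HBase RPC protocol (protobuf over TCP), not Thrift.
val conf = HBaseConfiguration.create()

// HBaseContext ships this configuration to each executor, so every
// partition can open its own HBase connection locally.
val hbaseContext = new HBaseContext(sc, conf)
```

Once the HBaseContext exists, the module's RDD and DataFrame operations (bulk puts, gets, scans) all route through it.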
Created 10-11-2016 08:24 PM
Depending on what data your Spark application accesses, it uses the relevant JVM HBase client APIs: filters, scans, gets, range gets, etc.
See the code here.
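As a hedged illustration of those client APIs in the Spark context, the sketch below runs a distributed scan with the hbase-spark module; the table name is a placeholder, and a live cluster is assumed.

```scala
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.spark.SparkContext

val sc = new SparkContext("local[*]", "hbase-scan-example")
val hbaseContext = new HBaseContext(sc, HBaseConfiguration.create())

// Each Spark partition issues ordinary HBase client scans against the
// region servers it covers -- the same RPC path as a standalone Java client.
val scan = new Scan().setCaching(100)
val rdd = hbaseContext.hbaseRDD(TableName.valueOf("my_table"), scan)

println(rdd.count())
```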