Spark HBase connector: "offset + length exceed the capacity of the array"



[Attachment: 38578-sampledata.png]

Using the Spark HBase connector (SHC), I am trying to write to and read from HBase.
I am able to write the dataframe to HBase, but the read operation fails with:
"java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 2".

Columns with int/double/float datatypes give this error; string columns work.
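This error pattern suggests a byte-width mismatch: HBase stores raw bytes, and decoding an Int reads a fixed 4-byte slice, but a cell that was written as a string such as "42" holds only 2 UTF-8 bytes. A minimal sketch using only the JDK (SHC's own byte helpers are not used here) illustrates the mismatch:

```scala
import java.nio.ByteBuffer

object ByteWidthDemo extends App {
  // A string "42" occupies only 2 bytes in UTF-8 ...
  val stringBytes = "42".getBytes("UTF-8")
  println(stringBytes.length) // 2

  // ... while an Int is always encoded as exactly 4 bytes, so decoding
  // an Int from the 2-byte cell fails with
  // "offset (0) + length (4) exceed the capacity of the array: 2"
  val intBytes = ByteBuffer.allocate(4).putInt(42).array()
  println(intBytes.length) // 4
}
```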

Should I explicitly convert the datatypes? Any suggestions are welcome.
Below is the write and read code I am using.

// Write DataFrame
dataFrame.write
  .options(Map(HBaseTableCatalog.tableCatalog -> cataLog, HBaseTableCatalog.newTable -> "5"))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .save()

// Read DataFrame
val df = sqlContext.read
  .options(Map(HBaseTableCatalog.tableCatalog -> cataLog))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .load()

// Table schema (catalog)
val cataLog = s"""{
                 |"table":{"namespace":"default", "name":"shctable"},
                 |"rowkey":"AccountNumber",
                 |"columns":{
                 |"AccountNumber":{"cf":"rowkey", "col":"AccountNumber", "type":"string"},
                 |"col1":{"cf":"cf1", "col":"col1", "type":"int"},
                 |"col2":{"cf":"cf2", "col":"col2", "type":"int"}
                 |}
                 |}""".stripMargin
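On the question of explicit conversion: one workaround consistent with the catalog above is to cast the dataframe columns to the declared types before writing, so SHC encodes fixed-width int bytes rather than string bytes. A sketch only, assuming the `dataFrame`, `cataLog`, and SHC setup from this post; not verified against SHC 1.1.1:

```scala
import org.apache.spark.sql.functions.col

// Cast the numeric columns to exactly the types declared in the catalog,
// so the bytes written to HBase match what the reader expects to decode.
val typedDf = dataFrame
  .withColumn("col1", col("col1").cast("int"))
  .withColumn("col2", col("col2").cast("int"))

typedDf.write
  .options(Map(HBaseTableCatalog.tableCatalog -> cataLog, HBaseTableCatalog.newTable -> "5"))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .save()
```

If the table was previously written with mismatched types, the old cells still hold the short byte arrays, so the table may need to be truncated and rewritten after the cast is in place.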

Environment Details:
Spark: 1.6.3
HBase: 1.1.2
Spark HBase Connector (SHC): 1.1.1-1.6-s_2.10
HDP Version: 2.6

Sample dataframe image attached (38578-sampledata.png).
