HBase table issue while reading data through Phoenix driver

Explorer

Hi Team,

I am having two issues with the Cloudera HBase connection:
1. Connecting to HBase through the Phoenix driver is not working.

2. Somehow I am able to connect to HBase through the HBase shell and insert data, but I have a problem reading that data through the Phoenix driver.

The steps I followed are below:
I created the HBase table using sqlline and inserted data through the HBase shell put API from a Spark Java application, but my consumer app reads the data through the Phoenix driver.

Can you please help me with any table configuration that would let me use the HBase table from the HBase shell as well as the Phoenix driver?

The issue with the existing table is:

I am able to query the data properly through the HBase shell, but when I query it through the Phoenix driver the rowkey value comes back truncated (only the first letter); the other columns are fine.

While creating the table, I used the following options:
COLUMN_ENCODED_BYTES=0, SALT_BUCKETS=88, COMPRESSION='SNAPPY', DATA_BLOCK_ENCODING='FAST_DIFF'
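
For reference, the DDL I ran in sqlline was roughly equivalent to this Java sketch through the Phoenix JDBC driver (table and column names are made up, and "zk1:2181" stands in for our ZooKeeper quorum):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreatePhoenixTable {
    public static void main(String[] args) throws Exception {
        // "zk1:2181" is a placeholder ZooKeeper quorum.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1:2181");
             Statement stmt = conn.createStatement()) {
            // Table and column names are hypothetical; the table options
            // mirror the ones listed above.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS MY_TABLE ("
              + "  ROWKEY VARCHAR PRIMARY KEY,"
              + "  CF.COL1 VARCHAR)"
              + " COLUMN_ENCODED_BYTES=0, SALT_BUCKETS=88,"
              + " COMPRESSION='SNAPPY', DATA_BLOCK_ENCODING='FAST_DIFF'");
        }
    }
}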


Explorer

This question is still open.

Master Collaborator

Hi @bavisetti, writing to Phoenix tables from HBase is not supported; please consider writing to Phoenix tables from Phoenix only.
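
Phoenix owns the row key encoding: with SALT_BUCKETS set it prepends a salt byte to every row key it writes, and raw HBase shell puts bypass that encoding, which is most likely why the rowkey looks mangled when read back through Phoenix. A minimal write through the Phoenix JDBC driver would look something like this (the quorum and table names are placeholders matching the sketch above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PhoenixUpsert {
    public static void main(String[] args) throws Exception {
        // Placeholder quorum and table names, matching the DDL sketch above.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1:2181");
             PreparedStatement ps = conn.prepareStatement(
                 "UPSERT INTO MY_TABLE (ROWKEY, COL1) VALUES (?, ?)")) {
            ps.setString(1, "row-001");
            ps.setString(2, "some value");
            ps.executeUpdate();
            conn.commit(); // Phoenix connections are not auto-commit by default
        }
    }
}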

Explorer

Hi @will, I tried connecting the phoenix5-spark driver through the CDP environment. I am facing the below issue:

Exception in thread "streaming-job-executor-0" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
    at org.apache.phoenix.spark.DataFrameFunctions.getFieldArray(DataFrameFunctions.scala:76)
    at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:35)
    at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:28)
    at org.apache.phoenix.spark.DefaultSource.createRelation(DefaultSource.scala:48)

I am using the below jar versions:
phoenix5-spark-6.0.0.7.1.6.146-1
spark.version: 3.1.1
scala.version: 2.12

Can you please let me know the compatible versions of phoenix5-spark and all mandatory jar versions for writing data through a Phoenix table?
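
For context, the write in my streaming job is essentially the following sketch (table name and ZooKeeper quorum changed); the NoSuchMethodError comes out of this saveToPhoenix path, which I understand can happen when the connector jar was built against a different Scala binary version than the Scala 2.12 that Spark 3.1 runs on:

import java.util.Collections;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

public class PhoenixSparkWrite {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("phoenix-write").getOrCreate();

        // One-row DataFrame matching the hypothetical MY_TABLE schema above.
        StructType schema = new StructType()
            .add("ROWKEY", DataTypes.StringType)
            .add("COL1", DataTypes.StringType);
        Dataset<Row> df = spark.createDataFrame(
            Collections.singletonList(RowFactory.create("row-001", "some value")), schema);

        // phoenix5-spark DataSource write; this is the code path shown in the
        // stack trace above. "zk1:2181" is a placeholder ZooKeeper quorum.
        df.write()
          .format("org.apache.phoenix.spark")
          .mode(SaveMode.Overwrite) // the connector only supports Overwrite (it upserts)
          .option("table", "MY_TABLE")
          .option("zkUrl", "zk1:2181")
          .save();

        spark.stop();
    }
}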

Explorer

Hi @willx,

But we should have some option to configure Phoenix table creation with a bucket configuration other than the row-key prefix (salt) partitioning.
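
For example, pre-splitting the table at explicit row-key boundaries instead of salting it; a sketch of what I mean, reusing the hypothetical names from above (the split points are made up too):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreatePreSplitTable {
    public static void main(String[] args) throws Exception {
        // Placeholder ZooKeeper quorum, as in the earlier sketches.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1:2181");
             Statement stmt = conn.createStatement()) {
            // Pre-split at explicit row-key boundaries instead of SALT_BUCKETS,
            // so the raw HBase row key stays exactly what the application writes.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS MY_TABLE_PRESPLIT ("
              + "  ROWKEY VARCHAR PRIMARY KEY,"
              + "  CF.COL1 VARCHAR)"
              + " COLUMN_ENCODED_BYTES=0"
              + " SPLIT ON ('g', 'n', 't')"); // hypothetical split points
        }
    }
}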

Thanks,

Jyothsna