Created on 04-10-2023 04:22 AM - edited 04-10-2023 04:42 AM
Hi Team,
I am having two issues with the Cloudera HBase connection:
1. Connecting to HBase through the Phoenix driver is not working.
2. I am able to connect to HBase through the HBase shell and insert data, but I have a problem reading that data through the Phoenix driver.
The steps I followed are below:
I created the HBase table using sqlline and insert data through the HBase shell Put API from a Spark Java application.
My consumer application reads the data through the Phoenix driver.
Can you please suggest any table configuration that would let me use the same HBase table from both the HBase shell and the Phoenix driver?
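For context, the write path from the Spark Java application looks roughly like this (a sketch only; the table name, column family, and column names are placeholders, not the real schema):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBasePutExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("MY_TABLE"))) {
            // Raw HBase Put: writes the row key bytes exactly as given,
            // with no Phoenix salt byte or Phoenix type encoding.
            Put put = new Put(Bytes.toBytes("row-001"));
            put.addColumn(Bytes.toBytes("CF"), Bytes.toBytes("COL1"), Bytes.toBytes("value1"));
            table.put(put);
        }
    }
}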
The issue with the existing table is:
I can query the data correctly through the HBase shell, but when I query it through the Phoenix driver the rowkey value comes back truncated (only the first letter); the other columns are fine.
While creating the table, I used the following configuration:
COLUMN_ENCODED_BYTES=0, SALT_BUCKETS=88, COMPRESSION='SNAPPY', DATA_BLOCK_ENCODING='FAST_DIFF'
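The sqlline DDL was roughly the following (a sketch; the table and column names are placeholders, not the real schema):

CREATE TABLE MY_TABLE (
    ROWKEY VARCHAR PRIMARY KEY,
    CF.COL1 VARCHAR,
    CF.COL2 VARCHAR
) COLUMN_ENCODED_BYTES=0, SALT_BUCKETS=88, COMPRESSION='SNAPPY', DATA_BLOCK_ENCODING='FAST_DIFF';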
Created 04-14-2023 08:11 PM
This question is still open.
Created 04-18-2023 10:30 PM
Hi @bavisetti, writing to Phoenix tables from HBase is not supported; please consider writing to Phoenix tables through Phoenix only.
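For example, a minimal write through the Phoenix JDBC driver looks roughly like this (a sketch; the ZooKeeper quorum in the JDBC URL and the table/column names are placeholders for your environment):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PhoenixUpsertExample {
    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URL; substitute your cluster's ZooKeeper quorum.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181:/hbase");
             PreparedStatement ps = conn.prepareStatement(
                     "UPSERT INTO MY_TABLE (ROWKEY, COL1) VALUES (?, ?)")) {
            ps.setString(1, "row-001");
            ps.setString(2, "value1");
            ps.executeUpdate();
            // Phoenix connections do not autocommit by default.
            conn.commit();
        }
    }
}

Writing this way lets Phoenix apply its own row-key encoding (including the salt byte on salted tables), so reads through the Phoenix driver see consistent data.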
Created 04-20-2023 02:25 AM
Hi @will, I tried connecting with the phoenix5-spark driver in the CDP environment. I am facing the issue below:
Exception in thread "streaming-job-executor-0" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
    at org.apache.phoenix.spark.DataFrameFunctions.getFieldArray(DataFrameFunctions.scala:76)
    at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:35)
    at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:28)
    at org.apache.phoenix.spark.DefaultSource.createRelation(DefaultSource.scala:48)
I am using the following jar versions:
phoenix5-spark-6.0.0.7.1.6.146-1
spark.version: 3.1.1
scala.version: 2.12
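In the pom these are declared roughly as follows (a sketch; the artifact coordinates are my assumption, not verified):

<!-- Sketch only; groupId/artifactId values are assumptions. -->
<dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix5-spark</artifactId>
    <version>6.0.0.7.1.6.146-1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.1.1</version>
</dependency>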
Can you please let me know the compatible version of phoenix5-spark, along with all mandatory jar versions, for writing data through a Phoenix table?
Created 04-20-2023 02:28 AM
Hi @willx,
But we should have some option to configure Phoenix table creation with a bucketing scheme other than the salted row-key prefix partitioning.
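For example, one possibility might be pre-splitting on explicit row-key boundaries instead of salting (a sketch; the split points and names are placeholders):

CREATE TABLE MY_TABLE (
    ROWKEY VARCHAR PRIMARY KEY,
    CF.COL1 VARCHAR
) COLUMN_ENCODED_BYTES=0, COMPRESSION='SNAPPY', DATA_BLOCK_ENCODING='FAST_DIFF'
SPLIT ON ('g', 'n', 't');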
Thanks,
Jyothsna