Member since: 03-17-2018
Posts: 7
Kudos Received: 2
Solutions: 0
05-31-2018 02:26 PM
I found the answer myself: use df.saveToPhoenix(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> hbaseConnectionString)).
05-31-2018 02:25 PM
1 Kudo
Hi Anusuya, in Spark 2, to save your dataframe to a Phoenix table, instead of df.save("org.apache.phoenix.spark", SaveMode.Overwrite, Map("table" -> "OUTPUT_TABLE", "zkUrl" -> hbaseConnectionString)) use df.saveToPhoenix(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> hbaseConnectionString)).
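For context, here is a minimal end-to-end Scala sketch of that approach. It assumes Spark 2.x with the phoenix-spark connector on the classpath; the ZooKeeper quorum string and the INPUT_TABLE / OUTPUT_TABLE names are placeholders, not values from the original thread.

```scala
// Minimal sketch: read one Phoenix table and write to another with Spark 2.x.
// Assumes the phoenix-spark connector jar is on the classpath; the quorum string
// and table names below are placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.phoenix.spark._   // brings saveToPhoenix into scope on DataFrame

object PhoenixSaveExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("phoenix-save-example")
      .getOrCreate()

    // Placeholder ZooKeeper quorum, e.g. "zk1,zk2,zk3:2181:/hbase-unsecure"
    val hbaseConnectionString = "zk1,zk2,zk3:2181:/hbase-unsecure"

    // Read an existing Phoenix table into a DataFrame
    val df = spark.read
      .format("org.apache.phoenix.spark")
      .option("table", "INPUT_TABLE")
      .option("zkUrl", hbaseConnectionString)
      .load()

    // Write the DataFrame to another Phoenix table
    // (the DataFrame's columns must match the target table's schema)
    df.saveToPhoenix(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> hbaseConnectionString))

    spark.stop()
  }
}
```

The reason for the switch is that DataFrame.save was removed in Spark 2.0, so the write has to go through the saveToPhoenix method that the org.apache.phoenix.spark._ import adds to DataFrame.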
03-27-2018 05:48 PM
How can I save this "df" to another Phoenix table?
03-22-2018 06:59 AM
Hi all, we upgraded our cluster from HDP 2.5.0 to HDP 2.6.3. The external integrations (outside of the HDP stack) that we had built against Phoenix in HDP 2.5.0 no longer work with Phoenix in HDP 2.6.3. I replaced the jars & configuration files in the external systems with the HDP 2.6.3 versions. I am unable to understand the issue here, since the Phoenix (4.7) & HBase (1.1.2) versions are the same in both releases.
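Not part of the original post, but one way to narrow an issue like this down is a plain Phoenix JDBC smoke test run from the external system against the upgraded cluster, using the HDP 2.6.3 phoenix-client jar. Everything below (the ZooKeeper quorum and znode path) is a placeholder to adapt.

```scala
// Hedged diagnostic sketch: verify that the external system can reach Phoenix at all
// with the HDP 2.6.3 phoenix-client jar on the classpath. The JDBC URL below uses a
// placeholder quorum and znode; replace them with the cluster's actual values.
import java.sql.DriverManager

object PhoenixSmokeTest {
  def main(args: Array[String]): Unit = {
    val url = "jdbc:phoenix:zk1,zk2,zk3:2181:/hbase-unsecure"
    val conn = DriverManager.getConnection(url)
    try {
      // SYSTEM.CATALOG exists on any working Phoenix installation
      val rs = conn.createStatement().executeQuery(
        "SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 1")
      while (rs.next()) {
        println(s"Connected; sample catalog entry: ${rs.getString(1)}")
      }
    } finally {
      conn.close()
    }
  }
}
```

If this basic check succeeds, the problem is more likely in the integration-specific jars or configuration files than in Phoenix/HBase themselves.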
03-17-2018 07:00 AM
Thanks a lot @Sandeep Nemuri. It worked (y)