Created 03-09-2016 06:57 AM
Hi Subhash, Phoenix automatically writes data to HBase using HBase internal APIs such as Put and Delete. You should run UPSERT or DELETE queries from Phoenix.
Thanks for replying to my question. My requirement is a little different: I need to load data from MuleSoft. Currently we are using the HBase APIs to load data, but now we are planning to install Phoenix on our cluster. Does Phoenix support an API to load data from MuleSoft?
I don't think there is currently a Phoenix connector for MuleSoft. You could ask on the MuleSoft forums about that.
Can you export the MuleSoft data as CSV? If so, you can use the Phoenix bulk loader to load it into Phoenix. This is the preferred way to load large amounts of data into Phoenix tables.
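As a rough sketch, the MapReduce-based CSV bulk loader ships with the Phoenix client jar. The jar path, ZooKeeper quorum, table name, and input path below are placeholders; adjust them to your cluster before running:

```shell
# Load a CSV file from HDFS into an existing Phoenix table via MapReduce.
# Paths, table name, and ZooKeeper quorum are illustrative placeholders.
hadoop jar /usr/hdp/current/phoenix-client/phoenix-client.jar \
    org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    --table EXAMPLE_TABLE \
    --input /tmp/mulesoft_export.csv \
    --zookeeper zk-host:2181

# For smaller files, psql.py can load a CSV in a single client process instead:
# /usr/hdp/current/phoenix-client/bin/psql.py -t EXAMPLE_TABLE zk-host:2181 /tmp/mulesoft_export.csv
```

The MapReduce loader writes HFiles directly, so it bypasses the HBase write path and scales much better than row-at-a-time UPSERTs for large imports.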
You can also use UPSERT commands directly, as suggested above; however, that will not be as fast, and you would need to implement a program that reads from MuleSoft and runs the UPSERT statements through the Phoenix JDBC driver.
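A minimal Java sketch of that approach, assuming a hypothetical CUSTOMERS table and a placeholder ZooKeeper quorum. It needs a running Phoenix cluster and the phoenix-client jar on the classpath, so treat it as an outline rather than something you can run as-is:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PhoenixUpsertSketch {
    public static void main(String[] args) throws Exception {
        // jdbc:phoenix:<zookeeper quorum> -- replace zk-host:2181 with your cluster's quorum.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")) {
            // Batch commits rather than committing every row read from MuleSoft.
            conn.setAutoCommit(false);
            try (PreparedStatement ps =
                     conn.prepareStatement("UPSERT INTO CUSTOMERS (ID, NAME) VALUES (?, ?)")) {
                // In a real program this loop would iterate over records read from MuleSoft.
                ps.setLong(1, 1L);
                ps.setString(2, "Alice");
                ps.executeUpdate();
            }
            conn.commit();
        }
    }
}
```

Committing in batches (e.g. every few thousand rows) matters here, because each commit translates into HBase Puts over the network.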
You can also use the Pig -> Phoenix connector if you can access the MuleSoft data in Pig.
And lastly, you can build a Phoenix view on top of an existing HBase table.
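For example, assuming an existing HBase table named "t1" with a column family "cf1" (the names are illustrative), a Phoenix view can be declared over it like this:

```sql
-- Map an existing HBase table into Phoenix as a view.
-- Quoted identifiers preserve the exact case of the HBase table and column names.
CREATE VIEW "t1" (
    pk        VARCHAR PRIMARY KEY,   -- the HBase row key
    "cf1"."a" VARCHAR,               -- column "a" in family "cf1"
    "cf1"."b" VARCHAR
);
```

Note that a Phoenix view over an existing HBase table is read-only; if you need to write through Phoenix, map it with CREATE TABLE instead.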
Phoenix has a JDBC driver. Try setting up a generic DB connector in Mulesoft )https://docs.mulesoft.com/mule-user-guide/v/3.6/database-connector). The pointing that at Phoenix. As long as you are not trying to run unsupported query expressions, you should be able to write to Phoenix as it was just another RDBMS.
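In Mule 3.x that would look roughly like the generic database config below. The JDBC URL format and driver class are real Phoenix values, but check the connector attribute names against the MuleSoft docs linked above, as this is a sketch rather than a verified config:

```xml
<!-- Sketch of a Mule 3.x generic database config pointing at Phoenix.
     Replace zk-host:2181 with your ZooKeeper quorum; the phoenix-client
     jar must be on Mule's classpath. -->
<db:generic-config name="Phoenix_Config"
    url="jdbc:phoenix:zk-host:2181"
    driverClassName="org.apache.phoenix.jdbc.PhoenixDriver"
    doc:name="Generic Database Configuration"/>
```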