Created 01-29-2016 09:58 PM
What jar(s) do I need to register or which command line options do I need to pass to Pig so that Phoenix + Pig integration works?
I'm using HDP 2.3.2. I've created a table in Phoenix and I would like to load sample data into a Phoenix table from a Hive table via a Pig script. When I try to store data into Phoenix I get "Could not resolve org.apache.phoenix.pig.PhoenixHBaseStorage using imports ...".
Created 01-29-2016 10:18 PM
@Michael Young register the Phoenix client jar from /usr/hdp/current/phoenix-client in your Pig script.
Created 01-29-2016 10:02 PM
@Michael Young you need access to the Phoenix client jar; there is more reference material linked here.
Created 01-29-2016 10:22 PM
Thank you that is what I was looking for. It wasn't clear to me which jar file I needed to register.
Created 01-29-2016 10:31 PM
@Michael Young try it out and let us know. Post your Pig script here so that others with the same question can reference it. I'm surprised myself that this isn't covered in the project docs.
Created 01-29-2016 10:56 PM
I created a Phoenix table using the phoenix-sqlline client via:
create table mytable (id varchar not null primary key, mycolumn varchar);
Here is the Pig script I used to attempt to load the data from Hive/HCatalog into Phoenix:

REGISTER /usr/hdp/2.3.2.0-2950/phoenix/phoenix-4.4.0.2.3.2.0-2950-client.jar;

A = LOAD 'default.mytable' USING org.apache.hive.hcatalog.pig.HCatLoader();
STORE A INTO 'hbase://mytable' USING org.apache.phoenix.pig.PhoenixHBaseStorage('hbaseserver', '-batchSize 1000');
The Pig script is run via:
pig -x tez -useHCatalog mypigscript.pig
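As an alternative to a REGISTER statement inside the script, the original question also asked about command-line options: the jar can be supplied via Pig's `pig.additional.jars` property. This is a sketch, not confirmed on this cluster; the unversioned /usr/hdp/current symlink path is an assumption, so verify the jar name on your own install:

```shell
# Hypothetical alternative: pass the Phoenix client jar on the pig command line
# instead of REGISTERing it inside the script (path is an assumption for HDP).
pig -x tez -useHCatalog \
  -Dpig.additional.jars=/usr/hdp/current/phoenix-client/phoenix-client.jar \
  mypigscript.pig
```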
I no longer get a "Could not resolve ..." error. However, I'm now running into a different error saying the table doesn't exist in Phoenix.
Created 01-30-2016 12:18 AM
I had an issue with ZK.
I stopped HBase via Ambari. I ran "hbase clean --cleanZk". I then started HBase via Ambari. Now the Pig script is loading data.
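For anyone hitting the same stale-ZooKeeper-state problem, the recovery sequence above can be summarized as follows. Note that `hbase clean --cleanZk` deletes HBase's znodes, so it must only be run while HBase is fully stopped:

```shell
# 1. Stop HBase from Ambari first -- cleanZk must not run against a live cluster.
hbase clean --cleanZk   # removes HBase metadata znodes from ZooKeeper
# 2. Start HBase again from Ambari, then re-run the Pig load.
```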
@Neeraj Sabharwal @Josh Elser Thanks for helping to resolve the issue via another post.