
Is there any way to load a CSV file into Phoenix 4.4.1?


I am trying to load a CSV file into Phoenix and I'm getting the error below.

Error: ERROR client.ConnectionManager$HConnectionImplementation: The node /hbase is not in ZooKeeper. It should have been written by the master. Check the value configured in 'zookeeper.znode.parent'. There could be a mismatch with the one configured in the master.

I'm running the command below in the console. If there is an alternative command I should use instead, please suggest it.

hadoop jar /usr/hdp/2.4.2.0-258/phoenix/phoenix-4.4.0.2.4.2.0-258-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --view UDM_TRANS --input /home/hadoop1/sample.csv
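
Before rerunning the load, it may help to confirm which znode parent the HBase master actually registered in ZooKeeper, since the error points to a mismatch with zookeeper.znode.parent (on unsecured HDP clusters this is often /hbase-unsecure rather than /hbase). A minimal check, assuming the default HDP client paths and a ZooKeeper server reachable at localhost:2181; adjust both to your environment:

# Znode parent the client config expects (default HDP location for hbase-site.xml)
grep -A 1 'zookeeper.znode.parent' /etc/hbase/conf/hbase-site.xml

# Top-level znodes the master actually created (look for /hbase vs /hbase-unsecure)
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server localhost:2181 ls /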

1 ACCEPTED SOLUTION


Hi @rathna mohan, for Phoenix 4.0 and above, you'll want to use this syntax:

HADOOP_CLASSPATH=/path/to/hbase-protocol.jar:/path/to/hbase/conf hadoop jar /usr/hdp/2.4.2.0-258/phoenix/phoenix-4.4.0.2.4.2.0-258-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --view UDM_TRANS --input /home/hadoop1/sample.csv

Make sure to change the path to hbase-protocol.jar and the path to HBase conf to match your environment. Give this a try and let me know if it helps.
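
For example, on a default HDP 2.4.2 layout the paths usually resolve as in the sketch below; substitute your actual jar location (the hbase-protocol jar may carry a version suffix under /usr/hdp/2.4.2.0-258/hbase/lib), conf directory, and ZooKeeper host. The optional --zookeeper argument (listed on the bulk load page linked below) lets you pass the quorum and znode parent explicitly, which is worth trying given the znode mismatch in your original error; /hbase-unsecure is the usual parent on unsecured HDP clusters.

# Paths and the zk-host placeholder are assumptions for a default HDP 2.4.2 install; adjust to your cluster
HADOOP_CLASSPATH=/usr/hdp/2.4.2.0-258/hbase/lib/hbase-protocol.jar:/etc/hbase/conf hadoop jar /usr/hdp/2.4.2.0-258/phoenix/phoenix-4.4.0.2.4.2.0-258-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --view UDM_TRANS --input /home/hadoop1/sample.csv --zookeeper zk-host:2181:/hbase-unsecure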

Additional Link for Phoenix Bulk Loading: https://phoenix.apache.org/bulk_dataload.html
