I am new to Hadoop development, and I am currently experimenting with HBase tables.
I want to load data from a CSV file. The file has more than 10 million rows, and I want to populate an HBase table with them.
But I do not know how to do it. Can anybody help me?
What are the steps to populate an HBase table from my CSV file?
Thank you very much, I need somebody's help.
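From what I have read so far, the usual approach seems to be the ImportTsv MapReduce tool that ships with HBase. A rough sketch of the commands I am trying (the table name `wordcount`, column family `f`, and the HDFS paths are just my own example; the separator flag is there because ImportTsv defaults to tab, and my file is comma-separated):

```shell
# 1. Copy the CSV file into HDFS (path is my assumption)
hdfs dfs -mkdir -p /user/myuser/input
hdfs dfs -put word_count.csv /user/myuser/input/

# 2. Create the target table with one column family in the HBase shell
echo "create 'wordcount', 'f'" | hbase shell

# 3. Run ImportTsv; -Dimporttsv.separator=, because the data is CSV, not TSV
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  "-Dimporttsv.separator=," \
  -Dimporttsv.columns=HBASE_ROW_KEY,f:count \
  wordcount /user/myuser/input/word_count.csv
```

Is this roughly the right sequence, or am I missing a step?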
So do I have to download ImportTsv separately, or has ImportTsv been available ever since I downloaded and installed HBase on my cluster?
I tried to use the command
"hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns=HBASE_ROW_KEY,f:count wordcount word_count.csv"
and I get an error like this: permission denied: user = xxxxx, access = WRITE, inode = "/user":hdfs:supergroup:drwxr-xr-x
What must I do? Can you help me?
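From the error message, it looks like my user has no home directory in HDFS, and /user itself is only writable by the hdfs superuser (drwxr-xr-x, owner hdfs). I think someone with hdfs superuser rights would have to create one for me, roughly like this (xxxxx is just the placeholder user from the error, not a real name):

```shell
# Create an HDFS home directory for the user and hand ownership over;
# these must be run as the hdfs superuser, e.g. via sudo -u hdfs
sudo -u hdfs hdfs dfs -mkdir -p /user/xxxxx
sudo -u hdfs hdfs dfs -chown xxxxx:xxxxx /user/xxxxx
```

After that, the MapReduce job should be able to write its staging and output data under /user/xxxxx. Is that the correct fix?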
Sorry to bother you again.
I have uploaded it, but then I tried to upload the same file a second time.
My MapReduce job succeeds and the output file exists,
but my table is still empty, and I don't know why, because there is no error in the log.
I only changed the name of the output folder, nothing else:
the first time my output folder was "output", and the next time I changed it to "output2", etc.
Do you know why? I don't know why this happens. Thank you very much.
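Since my job writes an output folder, I suspect I ran ImportTsv with -Dimporttsv.bulk.output. If I understand the documentation correctly, that mode only generates HFiles into the output directory and does not touch the table; a separate bulk-load step is needed afterwards. A sketch of what I think the two steps should look like (the paths and table name are my guesses from my own setup):

```shell
# Step 1: ImportTsv with bulk.output only writes HFiles into the output
# directory; nothing is loaded into the table yet
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  "-Dimporttsv.separator=," \
  -Dimporttsv.columns=HBASE_ROW_KEY,f:count \
  -Dimporttsv.bulk.output=/user/xxxxx/output \
  wordcount /user/xxxxx/word_count.csv

# Step 2: move the generated HFiles into the table (completebulkload)
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles \
  /user/xxxxx/output wordcount
```

Could the missing second step explain why the table stays empty even though the job succeeds?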