Member since: 10-25-2016
Posts: 14
Kudos Received: 0
Solutions: 0
01-10-2017
08:47 AM
I ran a test with more than 800 records stored in both HBase and MySQL, because I want to compare server performance at that data size. The result: queries against the HBase table were slower than against MySQL. There was a lot of external noise when I installed Hadoop and HBase; for example, I did not have enough RAM or hard disk space, so I installed Hadoop in an environment below the minimum requirements. But what I want to ask is: why is HBase said to be faster than MySQL when the data is large? I understand that HBase can use MapReduce on Hadoop, but I don't know how HBase reads the columns in a table when we query it. From what I have read, MySQL uses a row-based layout to search for data. So why would HBase's column-based layout be faster than a row-based one? Can anyone explain this to me? Thank you very much.
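To make the row-versus-column distinction concrete: HBase stores each column family in its own set of HFiles, so a read that names a specific family/qualifier only has to touch those files, while a row-oriented engine typically reads whole rows from disk. A small illustration from the shell (the table and column names are hypothetical, borrowed from the word-count posts below):

```sh
# Scan only the f:count column of the word_count table; HBase reads just
# the HFiles belonging to column family "f" rather than entire rows
# (hypothetical table/column names).
echo "scan 'word_count', {COLUMNS => ['f:count'], LIMIT => 10}" | hbase shell
```

At 800 records this layout advantage is invisible and fixed overheads (JVM, HDFS round trips) dominate, which is consistent with MySQL winning at that size; the column-oriented layout only starts to pay off at much larger scales.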
Labels:
- Apache HBase
- HDFS
- MapReduce
12-25-2016
02:43 AM
Sorry to bother you again. I uploaded it once successfully, but when I try to upload the same file a second time, my MapReduce job reports success and the output file exists, yet my database is still empty. I don't know why, because there is no error in the log. The only thing I change is the name of the output folder: the first time my output folder was "output", the next time I changed it to "output2", and so on. Do you know why this happens? Thank you very much.
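One possible explanation, in case it helps someone later: if the job writes its result into an HDFS output directory as HFiles (for example when ImportTsv is run with -Dimporttsv.bulk.output), nothing reaches the table until a separate bulk-load step is run. A sketch, assuming that mode and the directory/table names from the commands in this thread:

```sh
# A job run with -Dimporttsv.bulk.output=output2 only writes HFiles into
# HDFS; this extra step is what actually loads them into the table
# (directory and table names assumed from the earlier posts).
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles output2 wordcount
```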
12-21-2016
01:06 AM
I have made it! I can upload the CSV now. Thanks a lot for your help! I appreciate it 😄
12-20-2016
11:59 PM
I tried the command "hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns=HBASE_ROW_KEY,f:count wordcount word_count.csv" and I get an error like this: permission denied: user = xxxxx, access = WRITE, inode = "/user":hdfs:supergroup:drwxr-xr-x. What must I do? Can you help me?
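That error means the user running the job has no writable home directory under /user in HDFS. A common fix is to create one as the HDFS superuser (a sketch, assuming the superuser is `hdfs` as on a default CDH install; substitute your real username for the redacted `xxxxx`):

```sh
# Create an HDFS home directory for the user and hand over ownership;
# "hdfs" is the HDFS superuser on a default CDH installation.
sudo -u hdfs hdfs dfs -mkdir -p /user/xxxxx
sudo -u hdfs hdfs dfs -chown xxxxx:xxxxx /user/xxxxx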
12-20-2016
05:24 AM
So must I download ImportTsv separately, or has the ImportTsv function been there since I downloaded and installed HBase on my cluster?
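For reference, ImportTsv ships as part of HBase itself (it is a class inside the HBase server jar), so there should be nothing extra to download. One quick way to check is to invoke it with no arguments, which prints its usage text:

```sh
# Running the class with no arguments prints its usage message,
# confirming it is already bundled with the HBase installation.
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv
```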
12-17-2016
11:58 PM
I am a new Hadoop developer, and I am now doing research with HBase tables. I want to load data from my CSV file, which has more than 10 million records, into an HBase table, but I do not know how to do it. Can anybody help me? What are the steps to populate an HBase table from my CSV file? Thank you very much; I need somebody's help.
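For anyone finding this later, the usual route is HBase's bundled ImportTsv tool. A minimal sketch, assuming a comma-separated file with two fields (word, count) and hypothetical table/file names:

```sh
# 1. Create the target table with one column family in the HBase shell.
echo "create 'wordcount', 'f'" | hbase shell

# 2. Copy the CSV into HDFS.
hdfs dfs -put word_count.csv /user/xxxxx/

# 3. Run ImportTsv, mapping the first field to the row key and the
#    second to column f:count. The separator must be set explicitly
#    for CSV, because ImportTsv defaults to tab-separated input.
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  '-Dimporttsv.separator=,' \
  -Dimporttsv.columns=HBASE_ROW_KEY,f:count \
  wordcount /user/xxxxx/word_count.csv
```

For tens of millions of rows this runs as a MapReduce job, so it scales with the cluster rather than loading through a single client.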
Labels:
- Apache HBase
- Apache Zookeeper
- HDFS
11-29-2016
10:02 AM
I am a new HBase developer, and I have a problem with my HBase. At first I could run the HBase Thrift server and use the HBase browser in Hue without any trouble, but after my computer restarted and I started the Thrift server manually again with the command "hbase thrift start" in the terminal, it returns "user root/myusername is not allowed to impersonate admin". I don't know why. I have changed the configuration: I added "hadoop.proxyuser.hbase.hosts" and "hadoop.proxyuser.hbase.groups" with the value "*", and the error in the Hue web UI still does not change. Can anybody help me? Please, I need help; I can't continue my thesis. Thank you very much.
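One thing worth checking: the error names the OS user that actually launched the Thrift server (root or your own user), and Hadoop's proxyuser entries must match that user, not just "hbase". A sketch of the core-site.xml change, assuming the server was started as root as the error text suggests:

```xml
<!-- core-site.xml: allow the user running "hbase thrift start" to
     impersonate others. "root" here is an assumption based on the
     error message; use whichever user starts the Thrift server. -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
```

HDFS and the Thrift server need a restart before the change takes effect; alternatively, starting the Thrift server as the hbase user would match the proxyuser entries you already added.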
Labels:
- Apache HBase
- Cloudera Hue
- HDFS
10-25-2016
09:16 AM
I am new to Cloudera and Hadoop development. I tried Cloudera Manager on my Ubuntu 14.04 LTS machine, chose parcels when asked to pick between parcels and packages, and installed Cloudera 5.4.3. But when I finished the installation, all of my services were in bad health, and one of the warnings/errors is about free space. Is there any connection between the bad health and the lack of free space on the hard disk of my virtual machine?
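Most likely yes: Cloudera Manager raises health warnings when log and data directories fall below its free-space thresholds, and on a small VM that can cascade into bad health across every service. A quick check from the VM (standard commands, nothing Cloudera-specific):

```sh
# Show free disk space per filesystem.
df -h
# See what is consuming space under the usual CDH log/data locations.
du -sh /var/log/* /var/lib/* 2>/dev/null | sort -h | tail
```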
Labels:
- Apache Hadoop
- Cloudera Manager