Created 04-07-2017 04:23 AM
Hi Sandeep,
This command will remove ALL HBase data... Snapshots, tables, WALs, etc. So be very careful with it.
JMS
Created 04-07-2017 04:41 AM
So when the above command deletes everything, I will lose all HBase tables/WALs. Are the above steps recommended in a production environment?
Created 04-07-2017 04:48 AM
It depends what you want to achieve, right? If your goal is to repair your production cluster, then it might be relevant. If your goal is just to test and see what it does, then whatever command it is, it should not be tested in production.
Basically, the command just does what I described: it deletes everything. It's up to you to decide whether it's relevant for your production environment or not. If your goal is just to drop a table, or all tables, then the shell is better.
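For example, dropping a single table from the HBase shell looks like this (the table name below is just a placeholder):

```shell
# Start the interactive HBase shell
hbase shell

# Inside the shell: a table must be disabled before it can be dropped
disable 'my_table'   # 'my_table' is a placeholder name
drop 'my_table'
```

Unlike wiping the HBase data directory, this removes only the one table and leaves snapshots, WALs, and the rest of the cluster untouched.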
Are you facing any issue with your production environment?
Created 04-07-2017 04:50 AM
No...we don't have any issues so far with HBASE. I am just trying to understand the concept of it.
Thanks for your assistance :)
Created 04-02-2019 07:27 AM
Hello,
If the data was present for the same issue, then what is the best solution?
Can you please help me.
Thanks
Created 04-02-2019 07:48 AM
Hi Vinod,
Can you please start a different thread and share your Master logs with us?
JMS
Created 04-01-2016 11:24 PM
This helped save a lot of time and effort. Thanks a lot!
Created 09-26-2018 05:14 AM
Hi,
Can anyone help me with this? I am getting this error, and the HBase Master goes down after every successful restart.
Could not obtain block: BP-892504517-172.31.16.10-1537889422648:blk_1073741826_1002 file=/hbase/hbase.version No live nodes contain current block Block locations: Dead nodes: . Throwing a BlockMissingException
    at org.apache.hadoop.hdfs.DFSInputStream.refetchLocations(DFSInputStream.java:1040)
    at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:1023)
    at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:1002)
    at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:642)
    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:895)
    at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:954)
    at java.io.DataInputStream.read(DataInputStream.java:149)
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:201)
    at org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:606)
    at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:689)
    at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:500)
    at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:169)
    at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:144)
    at org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:704)
    at org.apache.hadoop.hbase.master.HMaster.access$500(HMaster.java:194)
    at org.apache.hadoop.hbase.master.HMaster$1.run(HMaster.java:1834)
    at java.lang.Thread.run(Thread.java:748)
Created 09-26-2018 06:11 AM
Hi Ayush,
Have you tried to run hdfs fsck / ?
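For reference, a health check like that (run from a node with the HDFS client on the PATH) would look something like:

```shell
# Check the health of the whole HDFS namespace and
# list any files with missing or corrupt blocks
hdfs fsck / -list-corruptfileblocks

# Narrow the check to the HBase root directory,
# showing per-file block and replica locations
hdfs fsck /hbase -files -blocks -locations
```

If fsck reports the hbase.version file as corrupt or missing blocks, that would match the BlockMissingException in your stack trace.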
JM
Created 10-13-2018 03:51 AM
Hi jmspaggi,
Thanks a lot for your reply.
I was able to resolve the issue by reinstalling HBase alone.
Regards
Ayush