<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: how to recover hbase using hdfs data directory in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113797#M38464</link>
    <description>&lt;P&gt;Hi &lt;A href="https://community.hortonworks.com/users/2943/vxu.html"&gt;@Victor Xu&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;I followed your steps and it is working fine.&lt;/P&gt;&lt;P&gt;But I needed to restart HBase.&lt;/P&gt;&lt;P&gt;Can you please suggest another way where I don't need to restart the HBase service?&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Raja&lt;/P&gt;</description>
    <pubDate>Mon, 22 Aug 2016 22:15:21 GMT</pubDate>
    <dc:creator>raja_ray</dc:creator>
    <dc:date>2016-08-22T22:15:21Z</dc:date>
    <item>
      <title>how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113791#M38458</link>
      <description>&lt;P&gt;My old HDFS data directory location: /apps/hbase/data&lt;/P&gt;&lt;P&gt;My new HDFS data directory location: /apps/hbase/data2&lt;/P&gt;&lt;P&gt;HBase table name: CUTOFF2&lt;/P&gt;&lt;P&gt;create 'CUTOFF2', {NAME =&amp;gt; '1'}&lt;/P&gt;&lt;P&gt;I am doing the following steps to recover the data, but it is not working. Please tell me where I am going wrong:&lt;/P&gt;&lt;P&gt;hadoop fs -ls /apps/hbase/data/data/default/CUTOFF2/4c8d68c329cdb6d73d4094fd64e5e37d/1/d321dfcd3b1245d2b5cc2ec1aab3a9f2
hadoop fs -ls /apps/hbase/data2/data/default/CUTOFF2/8f1aff44991e1a08c6a6bbf9c2546cf6/1&lt;/P&gt;&lt;P&gt;put 'CUTOFF2', 'samplerow', '1:1', 'sampledata'
count 'CUTOFF2'&lt;/P&gt;&lt;P&gt;su - hbase&lt;/P&gt;&lt;P&gt;hadoop fs -cp /apps/hbase/data/data/default/CUTOFF2/4c8d68c329cdb6d73d4094fd64e5e37d/1/d321dfcd3b1245d2b5cc2ec1aab3a9f2 /apps/hbase/data2/data/default/CUTOFF2/8f1aff44991e1a08c6a6bbf9c2546cf6/1&lt;/P&gt;&lt;P&gt;major_compact 'CUTOFF2'&lt;/P&gt;&lt;P&gt;Please correct my steps so the recovery works.&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 11:01:32 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113791#M38458</guid>
      <dc:creator>raja_ray</dc:creator>
      <dc:date>2016-08-22T11:01:32Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113792#M38459</link>
      <description>&lt;P&gt;In &lt;CODE&gt;hbase-site.xml&lt;/CODE&gt;, you need to change the "hbase.rootdir" property to your new location.&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 11:26:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113792#M38459</guid>
      <dc:creator>sjiang</dc:creator>
      <dc:date>2016-08-22T11:26:12Z</dc:date>
    </item>
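The change described in the reply above would look roughly like this in hbase-site.xml (a sketch: the hdfs:// authority is a placeholder for the cluster's actual NameNode address, and the property only takes effect after an HBase restart):

```xml
<property>
  <!-- New HBase root directory on HDFS; 'namenode:8020' is a placeholder. -->
  <name>hbase.rootdir</name>
  <value>hdfs://namenode:8020/apps/hbase/data2</value>
</property>
```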
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113793#M38460</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/1947/rajaray.html" nodeid="1947"&gt;@Raja Ray&lt;/A&gt;, here are the steps to recover HFiles into another HDFS directory:&lt;/P&gt;&lt;P&gt;1. Shut down the HBase cluster that uses the old HDFS path.&lt;/P&gt;&lt;P&gt;2. Change 'hbase.rootdir' to the new path and restart HBase.&lt;/P&gt;&lt;P&gt;3. Create table 'CUTOFF2', so that the new table structure is created under the new HDFS path; it is, of course, empty.&lt;/P&gt;&lt;P&gt;4. Use distcp to copy the HFiles from the old path to the new path, in case the HFiles are very large.&lt;/P&gt;&lt;P&gt;5. Run 'hbase hbck' on the new cluster; it should report inconsistencies on 'CUTOFF2'.&lt;/P&gt;&lt;P&gt;6. Run 'hbase hbck -repair' on the problematic table, which will finalize the recovery.&lt;/P&gt;&lt;P&gt;7. Done.&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 13:13:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113793#M38460</guid>
      <dc:creator>vxu</dc:creator>
      <dc:date>2016-08-22T13:13:01Z</dc:date>
    </item>
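The numbered steps above can be sketched as a shell session (a dry-run sketch: the paths, table name, and column family come from this thread, but the script only prints each command instead of executing it, since 'hbase hbck -repair' modifies cluster metadata and should be reviewed first):

```shell
#!/bin/sh
# Dry-run sketch of the recovery steps: print each command for review
# instead of executing it against a real cluster.
OLD_ROOT=/apps/hbase/data
NEW_ROOT=/apps/hbase/data2
TABLE=CUTOFF2

run() { echo "$@"; }   # replace 'echo' with real execution when ready

# Steps 1-3: stop HBase, point hbase.rootdir at $NEW_ROOT, restart, then
# re-create the (empty) table so the new directory layout exists.
run hbase shell -n "create '$TABLE', {NAME => '1'}"

# Step 4: distcp the HFiles from the old root to the new one (distcp
# scales better than 'hadoop fs -cp' when the HFiles are large).
run hadoop distcp "$OLD_ROOT/data/default/$TABLE" "$NEW_ROOT/data/default/$TABLE"

# Steps 5-6: check consistency, then repair the table metadata.
run hbase hbck
run hbase hbck -repair "$TABLE"
```

Running the script prints the command sequence; swapping the `echo` inside `run` for actual execution turns it into the real procedure.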
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113794#M38461</link>
      <description>&lt;P&gt;Thanks Victor. I will follow your steps and will let you know.&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 13:36:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113794#M38461</guid>
      <dc:creator>raja_ray</dc:creator>
      <dc:date>2016-08-22T13:36:59Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113795#M38462</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/2943/vxu.html" nodeid="2943"&gt;@Victor Xu&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;I followed your steps and it is working fine.&lt;/P&gt;&lt;P&gt;But I needed to restart HBase.&lt;/P&gt;&lt;P&gt;Can you please suggest another way where I don't need to restart the HBase service?&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Raja&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 22:04:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113795#M38462</guid>
      <dc:creator>raja_ray</dc:creator>
      <dc:date>2016-08-22T22:04:48Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113796#M38463</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/1947/rajaray.html" nodeid="1947"&gt;@Raja Ray&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;1. Which version of HBase are you using?&lt;/P&gt;&lt;P&gt;2. When performing my steps, is there any specific error log that you can share with me?&lt;/P&gt;&lt;P&gt;3. Could you elaborate on your use case?&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Victor&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 22:10:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113796#M38463</guid>
      <dc:creator>vxu</dc:creator>
      <dc:date>2016-08-22T22:10:50Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113797#M38464</link>
      <description>&lt;P&gt;Hi &lt;A href="https://community.hortonworks.com/users/2943/vxu.html"&gt;@Victor Xu&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;I followed your steps and it is working fine.&lt;/P&gt;&lt;P&gt;But I needed to restart HBase.&lt;/P&gt;&lt;P&gt;Can you please suggest another way where I don't need to restart the HBase service?&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Raja&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 22:15:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113797#M38464</guid>
      <dc:creator>raja_ray</dc:creator>
      <dc:date>2016-08-22T22:15:21Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113798#M38465</link>
      <description>&lt;P&gt;Ok, I understand. But even if you just want to change the HDFS root directory for a running HBase cluster, you'll need a restart for it to take effect.&lt;/P&gt;&lt;P&gt;Do you mean you had already changed the root path to '/apps/hbase/data2' before starting your current HBase cluster?&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 22:27:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113798#M38465</guid>
      <dc:creator>vxu</dc:creator>
      <dc:date>2016-08-22T22:27:27Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113799#M38466</link>
      <description>&lt;P&gt;In other words, there is no 'hot switch' for the 'hbase.rootdir' parameter. If you want to change it, you have to restart HBase for the change to take effect.&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 22:31:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113799#M38466</guid>
      <dc:creator>vxu</dc:creator>
      <dc:date>2016-08-22T22:31:22Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113800#M38467</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/1947/rajaray.html" nodeid="1947"&gt;@Raja Ray&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;I checked, but an HBase rolling upgrade won't help here either, because the HMaster and the RegionServers both use 'hbase.rootdir' at runtime, and changing it on only some of them would cause data inconsistencies. So my suggestion would be to create a smaller temporary HBase cluster to handle all the production requests and do a quick restart on the main HBase cluster. Modifying 'hbase.rootdir' really needs downtime.&lt;/P&gt;&lt;P&gt;Hope that helps.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Victor&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 22:46:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113800#M38467</guid>
      <dc:creator>vxu</dc:creator>
      <dc:date>2016-08-22T22:46:52Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113801#M38468</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/2943/vxu.html" nodeid="2943"&gt;@Victor Xu&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;Thanks. I understand your point.&lt;/P&gt;&lt;P&gt;I have a couple of questions to understand the scenario more clearly:&lt;/P&gt;&lt;P&gt;1. If I put data in the temporary HBase cluster during the main cluster's downtime, how will I merge the data from the temporary cluster into the main cluster once it is up and running?&lt;/P&gt;&lt;P&gt;2. When I am restoring data from the old HDFS HFile location to the new location, how will I recover the memstore data?&lt;/P&gt;&lt;P&gt;3. If I shut down and restart the HBase service, is the memstore data flushed to HDFS HFiles at that time?&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Raja&lt;/P&gt;</description>
      <pubDate>Tue, 23 Aug 2016 16:05:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113801#M38468</guid>
      <dc:creator>raja_ray</dc:creator>
      <dc:date>2016-08-23T16:05:29Z</dc:date>
    </item>
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113802#M38469</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/1947/rajaray.html" nodeid="1947"&gt;@Raja Ray&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;To answer your questions:&lt;/P&gt;&lt;P&gt;1. How will I merge the data from the temporary cluster into the main cluster once it is up and running?&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;If there are only Put operations during the main cluster's downtime, you can use the CopyTable tool, or the Export &amp;amp; Bulkload tools, to migrate data from the temporary cluster back to the main cluster after it's up.&lt;/LI&gt;&lt;LI&gt;But if there are both Put and Delete operations during the downtime, the best way to migrate the data is to set up HBase replication from the temporary cluster to the main cluster. This reads all the WALs (write-ahead logs) and replays both the Puts and the Deletes on the main cluster after it's up.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;2. How will I recover the memstore data?&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;The memstore is the place in a RegionServer that holds incoming data. It starts growing again as new write operations arrive.&lt;/LI&gt;&lt;LI&gt;If you mean the block cache of the HFiles, that is reloaded into memory as new read operations arrive.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;3. Is the memstore data flushed to HDFS HFiles when HBase is shut down?&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Yes, the memstore is forced to flush to HFiles before a RegionServer shuts down.&lt;/LI&gt;&lt;LI&gt;Make sure the HDFS path '/apps/hbase/data/WALs/' is empty after HBase has been shut down; that confirms all the memstore data has been flushed into HFiles.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Victor&lt;/P&gt;</description>
      <pubDate>Tue, 23 Aug 2016 16:34:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113802#M38469</guid>
      <dc:creator>vxu</dc:creator>
      <dc:date>2016-08-23T16:34:28Z</dc:date>
    </item>
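The migration and shutdown checks in the answer above can be sketched as commands (a dry-run sketch: 'main-zk:2181:/hbase-unsecure' is a placeholder for the main cluster's actual ZooKeeper quorum and znode, and the script only prints the commands instead of running them):

```shell
#!/bin/sh
# Dry-run sketch: print the commands for migrating Put-only data back from
# a temporary cluster, and for verifying that all memstore data was flushed.
run() { echo "$@"; }   # replace 'echo' with real execution when ready

# Put-only downtime: CopyTable from the temporary cluster to the main one.
# '--peer.adr' points at the destination cluster (placeholder address).
run hbase org.apache.hadoop.hbase.mapreduce.CopyTable \
    --peer.adr=main-zk:2181:/hbase-unsecure CUTOFF2

# After shutting HBase down, the WAL directory should be empty; leftover
# files would mean some memstore data was never flushed to HFiles.
run hadoop fs -ls /apps/hbase/data/WALs/
```

With Deletes in the mix, CopyTable alone is not enough; as the answer notes, replication from the temporary cluster replays both Puts and Deletes from the WALs.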
    <item>
      <title>Re: how to recover hbase using hdfs data directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113803#M38470</link>
      <description>&lt;P&gt;Thanks a lot &lt;A rel="user" href="https://community.cloudera.com/users/2943/vxu.html" nodeid="2943"&gt;@Victor Xu&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;All points are clear.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Aug 2016 20:43:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/how-to-recover-hbase-using-hdfs-data-directory/m-p/113803#M38470</guid>
      <dc:creator>raja_ray</dc:creator>
      <dc:date>2016-08-24T20:43:44Z</dc:date>
    </item>
  </channel>
</rss>

