Created 09-08-2016 12:01 PM
Hi All,
We are trying to migrate our existing RDBMS (SQL database) system to Hadoop, and we are planning to use HBase for it. However, we are not sure how to denormalize the SQL data so that it can be stored in HBase's column format.
Is it possible? If yes, what would be the best approach?
Which HBase version is required for this?
Any suggestions are welcome.
Created 09-08-2016 12:51 PM
I would:
1. Denormalize the relational data into one wide result set per target HBase table, typically by joining the normalized tables out along their foreign keys.
2. Export each of those result sets from the RDBMS as a TSV file (a sketch follows this list).
3. Then do a bulk import into your HBase table for each TSV; see the following links on bulk imports, and the bulk-load sketch at the end of this post. (Inserting record by record will be much too slow for large tables.)
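For steps 1 and 2, here is a minimal sketch assuming the source is MySQL; the schema, join, and file names are purely illustrative, so swap in your own:

# --batch emits tab-separated rows; -N suppresses the header line.
mysql --batch -N shop_db -e "
  SELECT o.order_id, c.name, c.email, o.total
  FROM   orders o
  JOIN   customers c ON c.customer_id = o.customer_id" > orders.tsv

# Stage the TSV in HDFS so the bulk-load job can read it.
hdfs dfs -mkdir -p /data
hdfs dfs -put orders.tsv /data/orders.tsv

For large tables, Apache Sqoop can run the same kind of denormalizing query export in parallel across several mappers.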
I have used this workflow frequently, including loading 2.53 billion relational records into an HBase table. The more you do it, the more of it you will find yourself automating.
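For step 3 itself, HBase ships an ImportTsv MapReduce job that can write HFiles directly, plus a loader that moves them into the table in one shot. A minimal sketch, with the table name, column family, and paths again illustrative:

# Create the target table with one column family.
echo "create 'orders', 'cf'" | hbase shell

# Map the TSV columns to the row key and cells, writing HFiles rather
# than issuing puts (record-by-record inserts are far slower).
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.columns=HBASE_ROW_KEY,cf:name,cf:email,cf:total \
  -Dimporttsv.bulk.output=/tmp/orders_hfiles \
  orders /data/orders.tsv

# Hand the finished HFiles to the region servers.
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles \
  /tmp/orders_hfiles orders

Pre-splitting the table on your row-key distribution before loading avoids funneling all the HFiles into a single region.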