Export HBase data to csv
Labels: Apache HBase
Created 05-15-2016 01:40 AM
How do you export HBase data to CSV, either one table or the entire database (table by table)? I have always built a map/reduce job to do this. However, I understand Apache Pherf has these capabilities. I have also used Phoenix to create CSV:
1. !outputformat csv
2. !record data.csv
3. select * from mytable;
4. !record
5. !quit
I have also used the HBase Export tool, which creates a Hadoop sequence file in a target HDFS directory. I then create a Hive table on top of the sequence file and "select *" into another table that uses a CSV storage/file format. That takes a few steps but is not too complicated.
How else, folks? I am looking for the "easy" button here.
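For reference, those sqlline steps can also be scripted rather than typed interactively. A minimal sketch, assuming Phoenix's sqlline.py is used and given a SQL file as its second argument (the ZooKeeper quorum, file names, and table name below are placeholders):

Contents of export_mytable.sql:
!outputformat csv
!record data.csv
select * from mytable;
!record
!quit

Run it non-interactively:
sqlline.py zk1.example.com:2181 export_mytable.sql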
Created 05-18-2016 10:18 PM
For the easy button, read from HBase with Pig's HBaseStorage and write out with PigStorage or CSVExcelStorage (rough sketch below the links):
https://pig.apache.org/docs/r0.11.0/api/org/apache/pig/backend/hadoop/hbase/HBaseStorage.html
https://pig.apache.org/docs/r0.12.0/api/org/apache/pig/piggybank/storage/CSVExcelStorage.html
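A rough Pig sketch, assuming a column family cf with columns col1 and col2 (the jar path, table, column, and output names are placeholders), and that piggybank is registered so CSVExcelStorage is available:

REGISTER /usr/hdp/current/pig-client/piggybank.jar;  -- example path

-- load the HBase table; -loadKey adds the row key as the first field
data = LOAD 'hbase://mytable'
       USING org.apache.pig.backend.hadoop.hbase.HBaseStorage(
           'cf:col1 cf:col2', '-loadKey true')
       AS (rowkey:chararray, col1:chararray, col2:chararray);

-- write it out as CSV
STORE data INTO '/tmp/mytable_csv'
      USING org.apache.pig.piggybank.storage.CSVExcelStorage(',');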
Created 05-15-2016 07:17 AM
You can create a Hive external table mapped onto your HBase table using HBaseStorageHandler (see the example at the end of the Usage section), and then, as you did with your sequence file, "select *" from this table into a CSV table (stored as textfile, fields terminated by ','). A rough sketch follows.
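A minimal sketch, assuming an HBase table mytable with a column family cf and two string columns (table, family, column, and path names are placeholders):

-- Hive external table mapped onto the HBase table
CREATE EXTERNAL TABLE hbase_mytable (rowkey STRING, col1 STRING, col2 STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:col1,cf:col2")
TBLPROPERTIES ("hbase.table.name" = "mytable");

-- plain text table whose files land as CSV
CREATE TABLE mytable_csv (rowkey STRING, col1 STRING, col2 STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

INSERT OVERWRITE TABLE mytable_csv SELECT * FROM hbase_mytable;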
Created 10-04-2017 05:59 PM
I am getting a MapReduce error. It starts, but fails within a few minutes. Do you have a working example?
Created 05-15-2016 12:07 PM
You can also use HDF or Spark if you need to do some more interesting things with the data along the way; a rough Spark sketch follows.
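A minimal PySpark sketch, assuming Spark 2.x with Hive support and the Hive-on-HBase external table from the earlier answer (the table name and output path are placeholders):

from pyspark.sql import SparkSession

# Spark session with Hive support so the HBase-backed external table is visible
spark = (SparkSession.builder
         .appName("hbase-to-csv")
         .enableHiveSupport()
         .getOrCreate())

# read through the Hive mapping and write CSV out
df = spark.table("hbase_mytable")  # placeholder table name
df.write.option("header", True).csv("/tmp/mytable_csv")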
Created 05-15-2016 07:08 PM
@Sunile Manjee Try the Export utility that comes as part of HBase, which exports a table to HDFS (or a local path). Try something like the following:
bin/hbase org.apache.hadoop.hbase.mapreduce.Export table_name file:///tmp/db_dump/
It can also be done using the happybase library. Here's an example:
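(A minimal sketch, assuming an HBase Thrift server is running; the host, table name, decoding, and output path are placeholders.)

import csv
import happybase

# connect to the HBase Thrift server (host is a placeholder)
connection = happybase.Connection('hbase-thrift-host')
table = connection.table('mytable')

with open('mytable.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    # scan the whole table; each row comes back as (row_key, {column: value}) in bytes
    # note: columns are not aligned across rows here; map them explicitly for real use
    for row_key, data in table.scan():
        writer.writerow(
            [row_key.decode('utf-8')] +
            [v.decode('utf-8') for v in data.values()]
        )

connection.close()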
Created 06-13-2016 04:34 PM
Hi @Sunile Manjee,
I am following the HBase export table technique. I did an export and created a Hive table stored as sequencefile, but when I load the sequence file data into the Hive table, it gives me this error:
java.lang.RuntimeException: java.io.IOException: WritableName can't load class: org.apache.hadoop.hbase.io.ImmutableBytesWritable
It would be really helpful if you could let me know the solution. Thanks.
Created 06-13-2016 07:59 PM
Do you mind opening a separate HCC post for your question?
Created 06-13-2016 08:28 PM
Sure @Sunile Manjee, let me do it.
