Import Data from MongoDB to HBase using Spark
Labels: Apache HBase, Apache Spark
Created 09-13-2016 05:55 AM
Hi All,
Could someone kindly tell me how to import data from MongoDB into HBase, either using Spark or without it?
If neither is possible, is there any other way?
Regards,
Vijay
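For reference, here is a minimal sketch of the Spark route being asked about. This is only an illustration, not a tested recipe: it assumes the mongo-spark-connector and the HBase client library are on the classpath; the Mongo URI, the HBase table name, and the column family "cf" are all placeholders; and each document is flattened naively into one column per top-level field.

    import com.mongodb.spark.MongoSpark
    import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
    import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
    import org.apache.hadoop.hbase.util.Bytes
    import org.apache.spark.sql.SparkSession

    object MongoToHBase {
      def main(args: Array[String]): Unit = {
        // Read the Mongo collection as a DataFrame via the MongoDB Spark connector.
        val spark = SparkSession.builder()
          .appName("mongo-to-hbase")
          .config("spark.mongodb.input.uri", "mongodb://mongo-host:27017/mydb.mycoll") // placeholder URI
          .getOrCreate()
        val df = MongoSpark.load(spark)

        // Write out partition by partition, opening one HBase connection per partition.
        df.rdd.foreachPartition { rows =>
          val conn = ConnectionFactory.createConnection(HBaseConfiguration.create())
          val table = conn.getTable(TableName.valueOf("my_table")) // placeholder table
          try {
            rows.foreach { row =>
              // Use Mongo's _id as the HBase row key; one column per top-level field.
              val put = new Put(Bytes.toBytes(row.getAs[AnyRef]("_id").toString))
              row.schema.fieldNames.filterNot(_ == "_id").foreach { f =>
                val v = row.getAs[AnyRef](f)
                if (v != null)
                  put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes(f), Bytes.toBytes(v.toString))
              }
              table.put(put)
            }
          } finally {
            table.close()
            conn.close()
          }
        }
        spark.stop()
      }
    }

Anything beyond flat documents would need a flattening step first, or nested fields stored as serialized JSON strings.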
Created 09-13-2016 06:02 AM
Did you try this?
Created 09-13-2016 07:04 AM
I did try this example, but my MongoDB JSON data is very complex.
Created 09-13-2016 07:37 AM
Have you considered using Apache NiFi for this?
NiFi has built-in processors for working with data in both MongoDB and HBase.
You could use NiFi's GetMongo processor followed by the PutHBaseJSON processor to move the data from MongoDB to HBase.
Check out the following article for more info on using NiFi to interact with MongoDB:
https://community.hortonworks.com/articles/53554/using-apache-nifi-100-with-mongodb.html
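To make the GetMongo -> PutHBaseJSON mapping concrete, here is a rough sketch. The table, family, and field names are made up for illustration, and PutHBaseJSON expects a flat, single-level JSON object (nested values need a flattening or transform step first, or handling via the processor's complex-field behavior).

    Example FlowFile content coming out of GetMongo (one document):
      { "_id": "57d7a121fa937f710a7d486e", "name": "vijay", "city": "hyderabad" }

    Illustrative PutHBaseJSON settings:
      Table Name                : my_table   (placeholder)
      Row Identifier Field Name : _id        (use the Mongo _id as the row key)
      Column Family             : cf         (placeholder)

    Resulting HBase row (row key = the _id value):
      cf:name = vijay
      cf:city = hyderabad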
Created 09-13-2016 10:47 AM
Thanks for the response.
What happens if the process fails while fetching or storing the data?
Is this feasible for production use?
And how can I run the NiFi flow on an hourly basis?
Created 09-13-2016 12:46 PM
NiFi is definitely feasible for production use, and it is well suited to your MongoDB-to-HBase data movement use case. NiFi is a tool for managing dataflow and integration between systems in an automated, configurable way: it lets you stream, transform, and route data through a drag-and-drop UI.
Dealing with failures: NiFi is configurable, so when you build your flow you decide how failures are handled. In your case, you could build a flow that retries on failure and sends an email when a failure occurs (that's just one example; failure handling for both the fetch and the store steps can be configured however you need).
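As one concrete (hypothetical) wiring of that failure handling, using stock NiFi processors:

    GetMongo --success--> PutHBaseJSON --success--> (auto-terminate or downstream)
    PutHBaseJSON --failure--> PutHBaseJSON   (self-loop: the failed FlowFile is
                                              penalized and retried)
    PutHBaseJSON --failure--> PutEmail       (routing the same relationship to a
                                              second connection clones the FlowFile,
                                              so each failure also triggers an email)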
Executing NiFi on an hourly basis: NiFi isn't like traditional data movement schedulers; flows built in NiFi are 'always on', and data can be streamed continuously as it is received. That said, NiFi lets you schedule each processor individually, so you could set your GetMongo processor to run once every hour and have your PutHBaseJSON processor push data to HBase as soon as it arrives from GetMongo.
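For the hourly part specifically, the schedule lives on the processor itself (right-click GetMongo > Configure > Scheduling tab). Two ways to get roughly "once an hour"; the values below are illustrative:

    Scheduling Strategy : Timer driven
    Run Schedule        : 1 hour         # fires about once per hour

    Scheduling Strategy : CRON driven
    Run Schedule        : 0 0 * * * ?    # Quartz cron: top of every hour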
Check out this tutorial for getting started and building your first flow: http://hortonworks.com/hadoop-tutorial/learning-ropes-apache-nifi/
