
Import Data from MongoDB to HBase using Spark

Rising Star

Hi All,

Can someone please tell me how to import data from MongoDB into HBase, either using Spark or without it?

If not, is there any other way?

Regards,

Vijay


5 REPLIES

Super Guru

Rising Star

@Rajkumar Singh

I did try this example, but my MongoDB JSON data is very complex.
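
For reference, a Spark-based version of this job might look like the sketch below. It is only a minimal outline, assuming the MongoDB Spark connector is on the classpath and the happybase package is available on the executors; the hosts, database, table, column family, and field names are all placeholders, and nested documents are flattened with dotted paths before writing.

```python
# Minimal sketch: read from MongoDB with the Spark connector, flatten nested
# fields, and write each row to HBase via the Thrift-based happybase client.
# All hosts, database/table names, and fields below are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("mongo-to-hbase")
    # Connector 2.x/3.x style config; connector 10.x uses format("mongodb")
    .config("spark.mongodb.input.uri",
            "mongodb://mongo-host:27017/mydb.mycollection")
    .getOrCreate()
)

df = spark.read.format("com.mongodb.spark.sql.DefaultSource").load()

# The connector maps ObjectId to a struct with an "oid" string field, and
# nested documents to structs, so dotted paths flatten them into columns.
flat = df.selectExpr(
    "_id.oid AS row_key",
    "user.name AS name",
    "user.email AS email",
)

def write_partition(rows):
    import happybase  # runs on the executors; needs an HBase Thrift server
    conn = happybase.Connection("hbase-thrift-host")  # default port 9090
    table = conn.table("my_hbase_table")
    for r in rows:
        table.put(r.row_key.encode("utf-8"), {
            b"cf:name": (r.name or "").encode("utf-8"),
            b"cf:email": (r.email or "").encode("utf-8"),
        })
    conn.close()

flat.foreachPartition(write_partition)
```

Writing through foreachPartition keeps one HBase connection per partition rather than one per row; for deeply nested documents you would extend the selectExpr list, or serialize sub-documents to JSON strings, to decide exactly which fields become HBase columns.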

Expert Contributor

@Vijay Kumar J

Have you considered using Apache NiFi for this?

NiFi has built-in processors for working with data in both MongoDB and HBase.

You could use NiFi's GetMongo processor followed by the PutHBaseJSON processor to move the data from MongoDB to HBase.

Check out the following article for more info on using NiFi to interact with MongoDB:

https://community.hortonworks.com/articles/53554/using-apache-nifi-100-with-mongodb.html
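
To make the two processors concrete, here is a rough standalone Python equivalent of what GetMongo followed by PutHBaseJSON does, assuming pymongo and happybase are installed and an HBase Thrift server is reachable; the hosts, database, table, column family, and field names are placeholders.

```python
# Rough standalone equivalent of GetMongo -> PutHBaseJSON: pull documents
# from MongoDB and write one HBase row per document, one column per field.
# Hosts and all names are placeholders; requires pymongo and happybase.
import json

import happybase
from pymongo import MongoClient

mongo = MongoClient("mongodb://mongo-host:27017")
docs = mongo["mydb"]["mycollection"].find()

hbase = happybase.Connection("hbase-thrift-host")  # HBase Thrift, port 9090
table = hbase.table("my_hbase_table")

for doc in docs:
    # Use Mongo's _id as the HBase row key (PutHBaseJSON takes the row id
    # from a configurable field in much the same way).
    row_key = str(doc.pop("_id")).encode("utf-8")
    cells = {}
    for field, value in doc.items():
        # Nested documents/arrays are stored as JSON text here, similar to
        # PutHBaseJSON's "Complex Field Strategy = Text" behaviour.
        if isinstance(value, (dict, list)):
            value = json.dumps(value, default=str)
        cells[b"cf:" + field.encode("utf-8")] = str(value).encode("utf-8")
    table.put(row_key, cells)
```

In NiFi itself this is configuration rather than code: you point PutHBaseJSON at an HBase client service and give it a table name, a column family, and the JSON field to use as the row identifier.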

Rising Star

@Laurence Da Luz

Thanks for the response.

What if the process fails while fetching or storing the data?

Is it feasible for production use?

How can I execute this NiFi flow on an hourly basis?

Expert Contributor
@Vijay Kumar J

NiFi is definitely feasible for production use, and it is well suited to your MongoDB-to-HBase data movement use case. NiFi is a tool for managing dataflow and integration between systems in an automated, configurable way. It lets you stream, transform, and route data, and it uses a drag-and-drop UI.

Dealing with failures: NiFi is configurable. When you build your flow, you determine how you want failures to be handled. In your case, you could build a flow that retries on failure and sends out an email when a failure occurs. (This is just one example; how failures during fetching and storing are handled can be configured however you need.)
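
As a plain-code analogy (NiFi expresses this as flow wiring, not code): routing a processor's failure relationship back into itself approximates the retry loop below, and routing it to a PutEmail processor approximates the alert. The SMTP host, addresses, and retry limit are placeholders.

```python
# Plain-Python analogy of "retry on failure, then alert by email" -- in NiFi
# this is wiring, not code: route the processor's failure relationship back
# into itself for retries and into a PutEmail processor for notification.
# SMTP host, addresses, and the retry limit below are placeholders.
import smtplib
import time
from email.message import EmailMessage

MAX_RETRIES = 3

def notify_failure(error):
    msg = EmailMessage()
    msg["Subject"] = "Mongo-to-HBase transfer failed"
    msg["From"] = "nifi-alerts@example.com"
    msg["To"] = "oncall@example.com"
    msg.set_content(f"Transfer failed after {MAX_RETRIES} attempts: {error}")
    with smtplib.SMTP("smtp-host") as smtp:
        smtp.send_message(msg)

def transfer_with_retries(transfer_batch):
    """transfer_batch is whatever fetch-and-store step might fail."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            transfer_batch()
            return
        except Exception as error:  # fetch or store failed
            if attempt == MAX_RETRIES:
                notify_failure(error)
                raise
            time.sleep(2 ** attempt)  # back off before the next retry
```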

Executing NiFi on an hourly basis: NiFi isn't like traditional data-movement schedulers; flows built in NiFi are treated as 'always on', and data can be streamed continuously as it is received. That said, NiFi lets you schedule each processor individually, so you could set your GetMongo processor to run once every hour (for example, using the CRON driven scheduling strategy with a Quartz expression such as 0 0 * * * ?, which fires at the top of each hour) and have your PutHBaseJSON processor push data to HBase as soon as it arrives from GetMongo.

Check out this tutorial for getting started and building your first flow: http://hortonworks.com/hadoop-tutorial/learning-ropes-apache-nifi/