Created on 08-09-2016 11:07 AM - edited 08-18-2019 03:45 AM
Hi!
I wonder about the status of the ConsumeAMQP processor for NiFi since I am interested in consuming a live RabbitMQ stream.
The stream I am currently trying to consume is a live stream generated by a remote server. I am able to consume it with, for example, a connector written as a Scala script. In the connector I configure the following parameters:
factory.setUsername
factory.setPassword
factory.setVirtualHost
factory.setHost
factory.setPort
These are followed by the channel, queue and binding declarations (a rough sketch of the full connector follows this list):
channel.exchangeDeclare
channel.queueDeclare
channel.queueBind
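For reference, a minimal sketch of such a connector in Scala, using the RabbitMQ Java client, could look roughly like this. Host, credentials, exchange, queue and routing key are placeholders, and the flags on queueDeclare depend on how the publisher set things up:

import com.rabbitmq.client.{AMQP, ConnectionFactory, DefaultConsumer, Envelope}

object RabbitConsumerSketch extends App {
  val factory = new ConnectionFactory()
  factory.setUsername("user")            // placeholder credentials
  factory.setPassword("secret")
  factory.setVirtualHost("/")
  factory.setHost("rabbit.example.com")  // placeholder host
  factory.setPort(5672)

  val connection = factory.newConnection()
  val channel = connection.createChannel()

  // channel, exchange, queue and binding declarations
  channel.exchangeDeclare("my-exchange", "topic", true)        // name, type, durable
  channel.queueDeclare("my-queue", true, false, false, null)   // name, durable, exclusive, autoDelete, arguments
  channel.queueBind("my-queue", "my-exchange", "my.routing.key")

  // print each message as it arrives
  channel.basicConsume("my-queue", true, new DefaultConsumer(channel) {
    override def handleDelivery(tag: String, env: Envelope,
                                props: AMQP.BasicProperties, body: Array[Byte]): Unit =
      println(new String(body, "UTF-8"))
  })
}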
What I want to do is set up NiFi in my virtual HDP 2.4 environment to consume the data stream and push it into HDFS, for example. NiFi's ConsumeAMQP processor seems easy to configure. Here is a screenshot of the processor's configuration:
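In rough text form (property names as in NiFi 0.x/1.x, which may differ slightly in other versions; the values here are placeholders for the real ones):

Queue = my-queue
Host Name = rabbit.example.com
Port = 5672
Virtual Host = /
User Name = user
Password = secret
AMQP Version = 0.9.1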
Once I introduce the needed parameters and start the processor, I get the following errors:
I am a little confused about how to configure the Queue correctly, since I am not able to declare the channel nor bind it to the Queue. The issue is not a blocked firewall, since I am using the same terminal I use to connect to the Rabbit stream via a Scala script.
I have been looking around for a tutorial or documentation page covering Hortonworks and RabbitMQ without success.
Any help, tips, pointers or comments would be greatly appreciated!
Created 08-09-2016 12:23 PM
This could be due to auto-delete settings for queues, bindings and exchanges. I'd suggest talking to your administrators or installing a separate RabbitMQ instance that you can fully control.
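Since ConsumeAMQP only attaches to an already-existing queue (as far as I can tell, it does not declare exchanges, queues or bindings itself), an auto-delete queue can vanish between your script's run and NiFi's attempt to consume. A rough Scala illustration of the two declaration styles, with placeholder names and a channel obtained as in the connector sketch above:

import com.rabbitmq.client.Channel

def declareQueues(channel: Channel): Unit = {
  // An auto-delete queue is removed as soon as its last consumer disconnects,
  // so a consumer that attaches later may find no queue at all.
  channel.queueDeclare("transient-queue", false, false, true, null)   // durable=false, autoDelete=true

  // A durable, non-auto-delete queue keeps existing between consumers
  // (and survives broker restarts), which is usually what you want
  // when NiFi is the one consuming.
  channel.queueDeclare("stable-queue", true, false, false, null)      // durable=true, autoDelete=false
}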
Created 08-10-2016 03:30 PM
@Pierre Villard, I understand now. It is, like you say, a fairly simple situation. I have sent the admin an email requesting that a permanent Queue be defined. Do you recommend the Exchange be set as "direct" as well, or is it the same if it is defined as "topic"?
Created 08-10-2016 03:35 PM
@Arturo Opsetmoen Amador, it really depends on your requirements and on what will be pushing data to the queue. I'd recommend checking these details with your system admin, depending on what you are trying to achieve.
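To illustrate the difference (placeholder names, channel as in the connector sketch above): a direct exchange routes on an exact routing-key match, while a topic exchange matches patterns, where '*' stands for exactly one word and '#' for zero or more words.

import com.rabbitmq.client.Channel

def bindingExamples(channel: Channel): Unit = {
  // "direct": a message goes to the queues whose binding key
  // exactly equals the message's routing key.
  channel.exchangeDeclare("logs-direct", "direct", true)
  channel.queueDeclare("error-logs", true, false, false, null)
  channel.queueBind("error-logs", "logs-direct", "error")        // only routing key "error"

  // "topic": binding keys are patterns matched against routing keys.
  channel.exchangeDeclare("logs-topic", "topic", true)
  channel.queueDeclare("all-app-logs", true, false, false, null)
  channel.queueBind("all-app-logs", "logs-topic", "app.#")       // e.g. "app.auth.error", "app.ui.info"
}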
Created 08-10-2016 03:38 PM
Thanks a lot for the great help, @Pierre Villard !
Created 08-09-2016 12:54 PM
@Pierre Villard, my Scala script is just consuming from the remote stream (remote publisher). From the configuration I was given for the Rabbit connector, I am sure the queue is not durable. I will contact the admins of the remote server / publisher to see if it is possible to change the configuration to a durable queue. Thanks a lot for your help!
Created 08-09-2016 12:51 PM
Yes, @ozhurakousky! I think you are right, the queue is set to auto-delete ... I will be contacting the admins on the publisher side 🙂 Thank you!