Hi @Nishant Bangarwa,
Thank you so much for your answers. The first link, http://druid.io/docs/latest/development/extensions-core/kafka-ingestion.html, describes how to ingest data into Druid from Kafka using a supervisor, and I have that working with Kafka (a rough sketch of the spec I used is just below). However, I don't think I can do the same with RabbitMQ or with other streaming systems such as Spark or Storm. That is why I want to use a realtime node to ingest data into Druid and test further with RabbitMQ, Spark, Storm, and so on.
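For reference, here is roughly the supervisor spec I POSTed to the Overlord at http://OVERLORD_IP:8090/druid/indexer/v1/supervisor (trimmed down; the datasource name, topic, and broker address are just placeholders for my setup):

```json
{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "my-datasource",
    "parser": {
      "type": "string",
      "parseSpec": {
        "format": "json",
        "timestampSpec": { "column": "timestamp", "format": "auto" },
        "dimensionsSpec": { "dimensions": [] }
      }
    },
    "metricsSpec": [ { "type": "count", "name": "count" } ],
    "granularitySpec": {
      "type": "uniform",
      "segmentGranularity": "HOUR",
      "queryGranularity": "NONE"
    }
  },
  "tuningConfig": { "type": "kafka" },
  "ioConfig": {
    "topic": "my-topic",
    "consumerProperties": { "bootstrap.servers": "localhost:9092" },
    "taskCount": 1,
    "replicas": 1,
    "taskDuration": "PT1H"
  }
}
```

That part works fine for me with the Kafka indexing service.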
Do you have any suggestions on how to create a realtime node and run it?
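For the realtime node, what I have in mind is a spec file along these lines, using the rabbitmq firehose (from the druid-rabbitmq extension in extensions-contrib, if I am reading the docs right). This is just a sketch pieced together from the documentation; the connection and config values are placeholders and I have not verified every field name:

```json
[
  {
    "dataSchema": {
      "dataSource": "rabbitmq-test",
      "parser": {
        "type": "string",
        "parseSpec": {
          "format": "json",
          "timestampSpec": { "column": "timestamp", "format": "auto" },
          "dimensionsSpec": { "dimensions": [] }
        }
      },
      "metricsSpec": [ { "type": "count", "name": "count" } ],
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "HOUR",
        "queryGranularity": "NONE"
      }
    },
    "ioConfig": {
      "type": "realtime",
      "firehose": {
        "type": "rabbitmq",
        "connection": {
          "host": "localhost",
          "port": "5672",
          "username": "guest",
          "password": "guest",
          "virtualHost": "/"
        },
        "config": {
          "exchange": "test-exchange",
          "queue": "druid-queue",
          "routingKey": "#",
          "durable": true,
          "exclusive": false,
          "autoDelete": false
        }
      }
    },
    "tuningConfig": {
      "type": "realtime",
      "maxRowsInMemory": 75000,
      "intermediatePersistPeriod": "PT10M",
      "windowPeriod": "PT10M"
    }
  }
]
```

As far as I understand from the older documentation, I would then point druid.realtime.specFile at this file and start the node with io.druid.cli.Main server realtime, but please correct me if that is not the right way to run it.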
Many thanks