02-19-2019 12:25 PM - last edited on 02-21-2019 05:40 AM by cjervis
I have installed and set up Kafka (KAFKA-3.1.1-220.127.116.11.p0.2) in Cloudera Manager (Cloudera Enterprise 5.14.3) successfully. I have also configured a Splunk connector to allow Splunk to consume Cloudera Audit data.
However, I have to manually launch the connect-distributed.sh script and re-register the Splunk Sink connector if something fails. If the server is restarted, I would have to log in to the server and manually run the two commands (including a curl call) to get the distributed service (or maybe I should call it a role) running and to register it with the Splunk service.
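For context, the two manual steps look roughly like this. This is only a sketch: the parcel path, properties file, connector name, topic, and HEC settings are placeholders, and the worker's REST port is assumed to be the Kafka Connect default of 8083.

```shell
# 1. Start the Kafka Connect distributed worker (parcel path is a guess -- adjust to your cluster).
nohup /opt/cloudera/parcels/KAFKA/lib/kafka/bin/connect-distributed.sh \
  /etc/kafka/connect-distributed.properties &

# 2. Register the Splunk sink connector through the worker's REST API.
#    "splunk-audit-sink", the topic, and the HEC URI/token are hypothetical values.
curl -s -X POST http://localhost:8083/connectors \
  -H 'Content-Type: application/json' \
  -d '{
        "name": "splunk-audit-sink",
        "config": {
          "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
          "topics": "cloudera-audit",
          "splunk.hec.uri": "https://splunk-host:8088",
          "splunk.hec.token": "REPLACE-ME"
        }
      }'
```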
Is there a way to run scripts automatically when Cloudera Manager is used to restart Kafka?
If not, I'm thinking I will build a Python-based framework that runs from cron, checks the health of the connect-distributed.sh service, and re-runs it if it is down.
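A minimal version of that watchdog could even be a cron-driven shell script. This is just a sketch under assumptions: the REST URL, parcel paths, and log location are placeholders, and "healthy" is defined simply as the worker's REST API answering with HTTP 200.

```shell
#!/bin/bash
# Sketch of a cron-driven health check for the Kafka Connect distributed worker.
CONNECT_URL="${CONNECT_URL:-http://localhost:8083}"   # assumed REST port

# Healthy when the worker's REST API answers the probe with HTTP 200.
connect_is_healthy() {
  [ "$1" = "200" ]
}

check_and_restart() {
  local status
  # curl prints only the HTTP status code; "000" stands in for "no response at all".
  status=$(curl -s -o /dev/null -w '%{http_code}' "$CONNECT_URL/connectors" || echo 000)
  if ! connect_is_healthy "$status"; then
    # Hypothetical parcel/properties/log paths -- substitute the real ones.
    nohup /opt/cloudera/parcels/KAFKA/lib/kafka/bin/connect-distributed.sh \
      /etc/kafka/connect-distributed.properties >> /var/log/connect-health.log 2>&1 &
  fi
}

# Run the check only when invoked with --run, e.g. from cron:
#   */5 * * * * /usr/local/bin/connect-health.sh --run
[ "${1:-}" = "--run" ] && check_and_restart
true
```

Re-registering the connector after a restart is not strictly needed if the worker keeps its config topics, since a distributed worker reloads registered connectors from Kafka on startup; otherwise the same curl registration call would follow the restart.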
02-27-2019 11:26 AM
02-27-2019 11:36 AM
Unfortunately, I'm not a Java programmer. I do PHP, Python, and bash, and decided not to develop in Java many years ago because it appeared far too complex compared to other solutions. Also, at that point in time, Java still had speed issues from being pseudo-compiled; the language I was using back then was fully compiled.
Also, I assumed that Cloudera would get around to it eventually, as it's a part of the tools that appear to be under the CM support umbrella. Or, at least it's installed with the Kafka CSD.
I would think some kind of CSD that generically allows scripts to be run based on service/role events would be ideal.
Thanks for finally getting back to me!
Have a great day!