We have an application that uses Hive and Spark jobs. From our application (an edge node of the Hadoop cluster) we want a script that monitors whether the connections to Hive and Spark are working. Hive uses beeline with logins, and we cannot put credentials in the script. Is there another way to monitor the Hive and Spark JDBC connections, e.g. using curl?
If you just want to check whether your HiveServer2 port is accepting connections, you can run a check like the one below (10000 is the default HiveServer2 Thrift port):
nc -zv <hiveserver2 hostname> 10000
Connection to localhost 10000 port [tcp/ndmp] succeeded!
For Hive metastore connections (9083 is the default metastore port):
netstat -an | grep 9083 | wc -l
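The two checks above can be wrapped into one small script. This is a minimal sketch, assuming the stock default ports (10000 for HiveServer2, 9083 for the metastore) and a placeholder `HIVE_HOST` variable; it uses bash's built-in `/dev/tcp` so it also works on hosts where `nc` is not installed, and it needs no Hive credentials:

```shell
#!/usr/bin/env bash
# Sketch of a login-free liveness check for HiveServer2 and the Hive
# metastore. HIVE_HOST and the ports are assumed defaults; adjust them.

check_port() {
  local name="$1" host="$2" port="$3"
  # bash's /dev/tcp opens a raw TCP connection; no nc or login needed
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "OK: ${name} (${host}:${port}) is accepting connections"
  else
    echo "FAIL: ${name} (${host}:${port}) is not reachable"
  fi
}

check_port "HiveServer2"    "${HIVE_HOST:-localhost}" 10000
check_port "Hive metastore" "${HIVE_HOST:-localhost}" 9083
```

You can run this from cron every few minutes and alert on any line starting with FAIL.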
But note that the above doesn't guarantee that your Hive service is actually able to run tasks. The best way to figure that out is to enable Hive metrics, which you can poll every few minutes as your requirements dictate. If you have Grafana set up, that should help as well.
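If metrics are enabled, you can poll them over HTTP with curl, again with no beeline login. The endpoints below are assumptions based on stock defaults (the HiveServer2 web UI serves `/jmx` on port 10002 in Hive 2.x+, and the Spark History Server exposes a REST API on port 18080); adjust hosts and ports for your cluster:

```shell
#!/usr/bin/env bash
# Sketch: HTTP health probes via curl; no Hive credentials required.
# HS2_HOST / SPARK_HOST and the ports are assumed defaults.

check_http() {
  local name="$1" url="$2"
  # -f makes curl fail on HTTP error codes; -s silences the progress bar
  if curl -sf --max-time 5 "$url" >/dev/null; then
    echo "OK: ${name}"
  else
    echo "FAIL: ${name} (${url})"
  fi
}

check_http "HiveServer2 metrics"  "http://${HS2_HOST:-localhost}:10002/jmx"
check_http "Spark history server" "http://${SPARK_HOST:-localhost}:18080/api/v1/applications"
```

A 200 response from these endpoints tells you the service process is up and answering, which is a stronger signal than an open TCP port alone.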
PS: I'm not sure which distribution you are using; however, if you are on HDP, Ambari already has a "run service check" option, and you can see whether you can invoke it via its REST API.
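For reference, a hedged sketch of how that Ambari service check might be invoked over REST. The request shape follows Ambari's `/api/v1/clusters/.../requests` endpoint, but the host, cluster name, and credentials are placeholders, and note that this does require an Ambari login (unlike the port checks above), so weigh that against your no-credentials constraint:

```shell
#!/usr/bin/env bash
# Sketch: trigger Ambari's HIVE service check via REST. AMBARI_HOST,
# CLUSTER_NAME, and AMBARI_CRED are placeholders you must supply.

PAYLOAD='{"RequestInfo":{"context":"Hive Service Check via REST","command":"HIVE_SERVICE_CHECK"},"Requests/resource_filters":[{"service_name":"HIVE"}]}'

if [ -n "${AMBARI_HOST:-}" ]; then
  curl -s -u "${AMBARI_CRED:-admin:admin}" \
       -H 'X-Requested-By: ambari' \
       -X POST -d "$PAYLOAD" \
       "http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER_NAME:-mycluster}/requests"
else
  echo "AMBARI_HOST not set; skipping Ambari service check"
fi
```

The response includes a request ID you can poll for the check's pass/fail status.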