
Hive on Spark Jobs Succeed but Never Exit.


Hey All,


We have third-party tools submitting Hive on Spark jobs. These jobs finish successfully, but the connections they create continue to show in Cloudera Manager and never close, taking up space in the queue. Based on our testing, the Spark shells these applications create remain open. We're running CDH 5.7.1.


When we tried to replicate this issue in another environment using the same third-party tools (Talend and Jupyter), we could not reproduce it. However, the jobs we ran to simulate the workload were not identical to the ones in the higher-level environment where the issue occurs.


Has anyone run into this before and could share their solution?


Could it be that the applications are not sending a "close connection" signal, or that a firewall is blocking specific traffic and preventing the connections from closing? The logs from the containers, Hive, and the ApplicationMaster don't list anything obvious, but perhaps we're not seeing something we should expect to find in the logs?
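For context, one server-side safety net that could bound the damage while we investigate is HiveServer2's idle-session reaping. This is only a sketch using the standard `hive.server2.*` properties in hive-site.xml; the timeout values below are illustrative, not recommendations:

```xml
<!-- hive-site.xml: example values only; tune to your workload -->
<property>
  <name>hive.server2.session.check.interval</name>
  <value>3600000</value> <!-- scan for idle sessions every hour (ms) -->
</property>
<property>
  <name>hive.server2.idle.session.timeout</name>
  <value>7200000</value> <!-- close sessions idle for 2 hours (ms) -->
</property>
<property>
  <name>hive.server2.idle.operation.timeout</name>
  <value>7200000</value> <!-- time out operations idle for 2 hours (ms) -->
</property>
```

When HiveServer2 closes an idle session, the Hive on Spark session behind it should be released as well. That said, this only caps the leak; the cleaner fix would still be for the client tools to close their connections explicitly.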



