I am running a Spark job that writes a DataFrame into a partitioned Hive table. The write itself succeeds, but the job never shuts down gracefully and hangs in that state forever. It throws the following WARN message just as the shutdown process begins:
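For context, this is a sketch of the kind of write I am doing (the session setup, table names, and the `datadate` partition column are placeholders standing in for my real job, which runs on a cluster with Hive support enabled):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

// Hypothetical sketch of the failing job; "my_db.events" and the source
// table name are placeholders, not the real identifiers.
val spark = SparkSession.builder()
  .appName("HivePartitionedWrite")
  .enableHiveSupport()   // needed so saveAsTable targets the Hive metastore
  .getOrCreate()

val df = spark.table("my_db.staging_events")  // source DataFrame

df.write
  .mode(SaveMode.Overwrite)
  .partitionBy("datadate")      // matches the datadate=... directory in the error path
  .saveAsTable("my_db.events")  // Spark writes to a .hive-staging_* dir, then moves files into place
```

The `.hive-staging_hive_*/-ext-10000/_temporary/...` path in the WARN below is the intermediate staging directory this write path uses before committing output into the table location.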
17/01/20 12:34:18 WARN TaskSetManager: Lost task 70.0 in stage 298.0 (TID 33232, remotehost.com): org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException): No lease on /apps//.hive-staging_hive_2017-01-20_12-14-43_871_2636939455154836968-1/-ext-10000/_temporary/0/_temporary/attempt_201701201214_0298_m_000070_0/datadate=2015-11-15/part-00070 (inode 10141180694): File does not exist. Holder DFSClient_NONMAPREDUCE_311350472_53 does not have any open files.
Do you know why this happens, and what the fix is? Please let me know.