Hello, I am using Hive 2.3.0 with Spark 2.0.2. When I try to run Hive commands on Spark from the Hive console, the job gets stuck and I have to kill it manually.
The following error message appears in the Spark worker log. Could you please advise whether I am doing something wrong?
INFO worker.Worker: Executor app-20171114093447-0000/0 finished with state KILLED exitStatus 143
It is probably a Spark configuration issue, but the information you have shared is not enough to identify the root cause. Could you share the detailed logs (driver, executor, and YARN/worker logs)? Note that exit status 143 just means the executor received SIGTERM (128 + 15), i.e. it was killed; it does not by itself indicate the underlying failure.
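As a general starting point while you gather logs, here is a minimal sketch of Spark-related properties that can be set from the Hive console for Hive on Spark. All values below are illustrative assumptions, not a known fix for your cluster; adjust them to your environment.

```sql
-- Illustrative configuration sketch; tune values for your cluster.
set hive.execution.engine=spark;

-- Where Spark runs: yarn, or spark://<master-host>:7077 for standalone.
set spark.master=yarn;

-- Executors killed with status 143 are frequently memory-related,
-- so explicit memory settings are a common first thing to check.
set spark.executor.memory=2g;
set spark.driver.memory=1g;
set spark.executor.cores=2;

-- Enable the Spark event log so the job can be inspected after it is killed.
set spark.eventLog.enabled=true;
set spark.eventLog.dir=hdfs:///spark-logs;
```

With event logging enabled, the Spark history server can show what the executors were doing before they were terminated, which should make the detailed logs easier to interpret.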