Member since: 07-26-2016
Posts: 11
Kudos Received: 0
Solutions: 0
08-08-2016 02:29 AM
Hi Sunile/SBandaru, I checked with pig -x local and got the grunt shell; LOAD, DUMP and all my Pig scripts work fine there. When I tried running pig today (without -x local) so it connects to HDFS using the Hadoop configuration, it throws the following exception: ERROR 2999: Unexpected internal error. Failed to create DataStorage and Server IPC version 9 cannot communicate with client version 4. I appreciate all your help/support in advance. Regards, Anil Khiani
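The "Server IPC version 9 cannot communicate with client version 4" message usually means a Hadoop 1.x client (such as the jars bundled in a standalone Pig tarball) is talking to a Hadoop 2.x cluster. A minimal sketch of one way to check and point Pig at the cluster's own Hadoop client instead; the paths below assume an HDP-style layout and are illustrative, not confirmed from the thread:

# Check which Hadoop version the cluster runs (IPC version 9 implies Hadoop 2.x).
hadoop version

# Make sure Pig picks up the cluster's Hadoop client and configuration,
# not Hadoop 1.x jars shipped inside a standalone pig download.
export HADOOP_HOME=/usr/hdp/current/hadoop-client   # assumed install path
export HADOOP_CONF_DIR=/etc/hadoop/conf             # assumed config dir
export PIG_CLASSPATH=$HADOOP_CONF_DIR

# Run Pig in MapReduce mode again; it should now use the Hadoop 2.x client.
pig

If Pig was downloaded separately rather than installed with the distribution, using the distribution-bundled pig (or a build compiled against Hadoop 2) is another way past this mismatch.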
11-28-2017 09:10 AM
Hi Team, I have tried the above and the job status shows KILLED after running the workflow. After launching it in Oozie, I can see the workflow change status from RUNNING to KILLED. Is there a way to troubleshoot this? I can run hadoop fs -ls commands against my S3 bucket, so I definitely have access; I suspect it is the S3 URL. I tried downloading the XML, changing the URL, and uploading it again, with no luck. Any other suggestions? I appreciate all your help/support in advance. Regards, Anil
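A KILLED workflow usually records the underlying error against the failing action, so the Oozie CLI is the quickest place to look. A short troubleshooting sketch; the Oozie server URL, job id, bucket, and path are placeholders, not values from this thread:

# Show the overall job status and identify which action failed.
oozie job -oozie http://oozie-host:11000/oozie -info 0000001-171128000000000-oozie-oozi-W

# Pull the job log, which typically contains the reason for the KILLED state
# (for example an unsupported URL scheme or missing S3 credentials).
oozie job -oozie http://oozie-host:11000/oozie -log 0000001-171128000000000-oozie-oozi-W

# Confirm the same URL scheme the workflow uses (s3a:// vs s3://) also works
# from the command line, since the scheme must match the configured filesystem.
hadoop fs -ls s3a://your-bucket/path/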