I'm trying to execute a Pig script from the Pig view in Ambari, following the steps outlined in the Hadoop Tutorial. The job executing this script fails without giving any error message: both the Results and Logs sections are empty. I was also unable to find any Pig-related logs under /var/log.
Where else should I look for more information?
@Dmitry Otblesk it looks like your ApplicationMaster is requesting more memory than YARN allows. You might want to review a tutorial on the YARN Capacity Scheduler and queue tuning. A quick workaround for now would be to lower the following property
@dmitry is the Pig service check running fine? Also check Ambari > YARN > Quick Links > ResourceManager UI; it lists the application ID and a name like "PigLatin:<scriptname>.sh". Check the logs there.
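Once you have the application ID from the ResourceManager UI, the aggregated container logs can also be pulled from the command line. This is a sketch; `<application_id>` is a placeholder for the ID shown in the UI, and log aggregation must be enabled on the cluster:

```shell
# Fetch the aggregated YARN container logs for an application.
# Replace <application_id> with the ID from the ResourceManager UI.
yarn logs -applicationId <application_id>
```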
Actually, the Pig service check fails. The error message I see is:
org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request, requested memory < 0, or requested memory > max configured, requestedMemory=1024, maxMemory=256
How can I stop Pig from requesting so much memory (I do not have much available)?
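One hedged option, assuming the script runs on the MapReduce execution engine, is to lower the per-container memory requests inside the Pig script itself via `SET`, using standard Hadoop properties. The values below are illustrative and must fit under the cluster's `yarn.scheduler.maximum-allocation-mb`:

```pig
-- Request smaller containers for this job only (illustrative values).
SET yarn.app.mapreduce.am.resource.mb 256;
SET mapreduce.map.memory.mb 256;
SET mapreduce.reduce.memory.mb 256;
-- Keep the JVM heap below the container size to avoid physical-memory kills.
SET mapreduce.map.java.opts '-Xmx200m';
SET mapreduce.reduce.java.opts '-Xmx200m';
```

This only affects the one script, so it is useful when you cannot (or do not want to) change cluster-wide YARN settings.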
I found one more piece of information. One of the jobs shows the following error message:
Container [pid=11953,containerID=container_e23_1482543044221_0003_01_000002] is running beyond physical memory limits. Current usage: 263.3 MB of 256 MB physical memory used; 3.9 GB of 537.6 MB virtual memory used. Killing container
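As a side note, the virtual-memory figure in that message is consistent with YARN's default virtual-to-physical memory ratio (`yarn.nodemanager.vmem-pmem-ratio`, default 2.1), so the limits themselves look like stock settings:

```python
# The 537.6 MB virtual-memory limit in the log is the 256 MB physical limit
# scaled by yarn.nodemanager.vmem-pmem-ratio (YARN default: 2.1).
physical_limit_mb = 256
vmem_pmem_ratio = 2.1
print(physical_limit_mb * vmem_pmem_ratio)  # 537.6
```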
Apparently it has something to do with a lack of memory. So once again: how can I configure Pig to run with low memory and not request too much of it? I only need to process a few records, which should not require huge amounts of memory.
@Dmitry Otblesk Try setting the following from Ambari -> YARN -> Configs:

yarn.scheduler.maximum-allocation-mb = 1024
yarn.nodemanager.resource.memory-mb = 1024
Restart the required services and rerun the service check. Once that passes, the Pig script should execute fine.