I am running sample code from the Hortonworks examples here.
(1) Sometimes when I execute it, I get the following error:
File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found.
Regarding this error: the /user/admin/pig folder exists and the admin user has access to it, so why does this error come up?
(2) Sometimes the job hangs and stays stuck in RUNNING mode.
What could cause this?
Many thanks for any assistance.
This is the log message while it is stuck in RUNNING mode:
WARNING: Use "yarn jar" to launch YARN applications.
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Trying ExecType : TEZ_LOCAL
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Trying ExecType : TEZ
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Picked TEZ as the ExecType
2017-11-14 13:02:40,604 [main] INFO org.apache.pig.Main - Apache Pig version 0.16.0.2.6.0.3-8 (rexported) compiled Apr 01 2017, 22:23:34
2017-11-14 13:02:40,605 [main] INFO org.apache.pig.Main - Logging error messages to: /hadoop/yarn/local/usercache/admin/appcache/application_1510663088731_0004/container_e03_1510663088731_0004_01_000002/pig_1510664560569.log
2017-11-14 13:02:47,640 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /home/yarn/.pigbootup not found
2017-11-14 13:02:48,078 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://node1.example.ee:8020
2017-11-14 13:02:57,832 [main] INFO org.apache.pig.PigServer - Pig Script ID for the session: PIG-script.pig-2fb9e60b-4068-4d2f-8f64-319518455d04
2017-11-14 13:03:17,485 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://node1.example.ee:8188/ws/v1/timeline/
2017-11-14 13:03:22,702 [main] INFO org.apache.pig.backend.hadoop.PigATSClient - Created ATS Hook
2017-11-14 13:03:35,896 [main] INFO org.apache.pig.Main - Pig script completed in 56 seconds and 372 milliseconds (56372 ms)
Apart from doing the following on HDFS, you will also need to create an "admin" user on all the NodeManager hosts:
hdfs dfs -mkdir /user/admin
hdfs dfs -chown admin:hadoop /user/admin
hdfs dfs -chmod 755 /user/admin
Then, on all NodeManager machines:
# adduser -g hadoop admin
# id admin
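# id admin
# The two steps above can be verified with the sketch below. It assumes the
# hdfs client is on the PATH and you have superuser rights on HDFS; it only
# reads state, so it is safe to run repeatedly on each host.

# Verify the HDFS home directory exists with the expected owner, group,
# and mode (expected listing: drwxr-xr-x ... admin hadoop ... /user/admin).
hdfs dfs -ls -d /user/admin

# On each NodeManager host, verify the local account exists and is in
# the hadoop group.
id admin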
Also, please let us know which version of Ambari you are using.
And have you tried setting the Hadoop proxyuser hosts and groups to "*"?
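For Pig View specifically, the proxyuser settings go into core-site.xml (in Ambari: HDFS > Configs > Custom core-site). A minimal sketch is below; it assumes the Ambari server runs as root, so substitute your actual Ambari service account for "root" in the property names:

```
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
```

Note that HDFS has to be restarted after changing these properties for them to take effect.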
Hello, I'm facing the same issue and I've explained it in this question:
How did you resolve your issue? I really need your help.
I'm stuck and I haven't found a suitable solution. I'll be grateful if you could help me.
Unfortunately, I could not find a solution and gave up. I only heard from someone that Pig View does not work with HDP 2.6 and later; I do not know why. If you manage to resolve it, please share with us.