Support Questions

File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found for PIG View

Explorer

I am running sample code from the Hortonworks examples here.

Sometimes when I execute it, I get the following error:

File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found.

The /user/admin/pig folder exists and the admin user has access to it, so why does this error come up?

At other times the job hangs and stays stuck in RUNNING mode. What could cause that?

Many thanks for any assistance.
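One way to narrow this down is to check what actually exists under the Pig View job staging area and who owns it. A hedged sketch; these commands must be run on a cluster node (as the admin user or HDFS superuser), and the timestamped job directory is just the one from the error message:

```
# List the per-job staging directories that Pig View writes to
hdfs dfs -ls /user/admin/pig/jobs

# Check ownership and permissions on the admin user's HDFS home and pig directories
hdfs dfs -ls -d /user/admin /user/admin/pig
```

If /user/admin is missing or owned by another user, the view cannot write (or later read back) the stdout file, which would match the error above.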

6 REPLIES

Re: File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found for PIG View

Explorer

This is the log output while the job is stuck in RUNNING mode:

WARNING: Use "yarn jar" to launch YARN applications.
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Trying ExecType : TEZ_LOCAL
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Trying ExecType : TEZ
17/11/14 13:02:39 INFO pig.ExecTypeProvider: Picked TEZ as the ExecType
2017-11-14 13:02:40,604 [main] INFO org.apache.pig.Main - Apache Pig version 0.16.0.2.6.0.3-8 (rexported) compiled Apr 01 2017, 22:23:34
2017-11-14 13:02:40,605 [main] INFO org.apache.pig.Main - Logging error messages to: /hadoop/yarn/local/usercache/admin/appcache/application_1510663088731_0004/container_e03_1510663088731_0004_01_000002/pig_1510664560569.log
2017-11-14 13:02:47,640 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /home/yarn/.pigbootup not found
2017-11-14 13:02:48,078 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://node1.example.ee:8020
2017-11-14 13:02:57,832 [main] INFO org.apache.pig.PigServer - Pig Script ID for the session: PIG-script.pig-2fb9e60b-4068-4d2f-8f64-319518455d04
2017-11-14 13:03:17,485 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://node1.example.ee:8188/ws/v1/timeline/
2017-11-14 13:03:22,702 [main] INFO org.apache.pig.backend.hadoop.PigATSClient - Created ATS Hook
2017-11-14 13:03:35,896 [main] INFO org.apache.pig.Main - Pig script completed in 56 seconds and 372 milliseconds (56372 ms)

Re: File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found for PIG View

Super Mentor

@Mike Bit

In addition to doing the following on HDFS, you will also need to create an "admin" user on all the NodeManager hosts:

On HDFS:

hdfs dfs -mkdir /user/admin
hdfs dfs -chown admin:hadoop /user/admin
hdfs dfs -chmod 755 /user/admin


And on all NodeManager machines:

# adduser -g hadoop admin
# id admin


Also, please let us know which version of Ambari you are using.

And have you tried setting the Hadoop proxyuser hosts and groups properties to "*"?
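For reference, those proxyuser properties live in core-site.xml (in Ambari, under HDFS > Configs > Custom core-site). A minimal sketch, assuming the Ambari Server process runs as "root" — substitute the actual process user on your cluster if it differs:

```xml
<!-- Allow the Ambari Server user (assumed "root" here) to impersonate
     Pig View users from any host and any group -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
```

After changing these properties you need to restart HDFS (and the affected views) for them to take effect.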

Re: File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found for PIG View

Explorer

Many thanks, @Jay Kumar SenSharma. I will go through the deployment steps and let you know. I am using Ambari 2.5.0.

Re: File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found for PIG View

@Mike Bit

Hello, I'm facing the same issue, and I've explained it in this question:

https://community.hortonworks.com/questions/148295/how-to-resolve-pig-error-file-not-found-exception....

How did you resolve your issue? I really need your help.

I'm stuck and haven't found any workable solution. I'd be grateful if you could help me.

Re: File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found for PIG View

Explorer

Dear @raouia,

Unfortunately, I could not find a solution and gave up on it. I have only heard from someone that Pig View does not work with HDP 2.6 and above; I do not know why. If you manage to resolve it, please share it with us.

Re: File /user/admin/pig/jobs/sumofhoursmile_14-11-2017-12-55-42/stdout not found for PIG View

@Mike Bit

Thank you for your reply. I've tried many of the solutions suggested on the forums, but without any positive result.

I will keep you informed of any updates.