
Using yarn logs command

Explorer

Hi,

 

If I try to get the logs for an application like this:

yarn logs -applicationId application_1575531060741_10424

The command fails because I am not running it as the application owner.  I need to run it like this:

yarn logs -applicationId application_1575531060741_10424 -appOwner hive

 

The problem is that I want to write all the YARN logs out to the OS so I can ingest them into Splunk. Figuring out the appOwner for each application is awkward and time-consuming, even in a script.
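For context, this is roughly the per-application lookup I end up scripting today (a rough sketch only; it assumes the owner shows up as "User : <name>" in the yarn application -status report, and the output directory is just an example):

#!/usr/bin/env bash
# Rough sketch: look up each finished application's owner, then dump its logs.
OUT=/var/log/yarn-dump    # example output directory
mkdir -p "$OUT"

for APP_ID in $(yarn application -list -appStates FINISHED 2>/dev/null \
                  | awk '$1 ~ /^application_/ {print $1}'); do
  # The status report prints a line like "User : hive"
  OWNER=$(yarn application -status "$APP_ID" 2>/dev/null \
            | awk '$1 == "User" {print $3}')
  yarn logs -applicationId "$APP_ID" -appOwner "$OWNER" > "$OUT/$APP_ID.log"
done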

 

Is there a better way to dump all the YARN logs to the OS?

 

Thanks

 

 

 

6 REPLIES

Expert Contributor

Hi @Daggers 

 

I think you can try this:

 

1. The properties below decide where YARN stores aggregated application logs in HDFS.

Below is a sample from my cluster:

yarn.nodemanager.remote-app-log-dir = /app-logs
yarn.nodemanager.remote-app-log-dir-suffix = logs-ifile

2. You can do a "hadoop dfs -copyToLocal" on the path above, which will copy all applications' logs to the local filesystem, and then you can pass them to Splunk.
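For example, a minimal sketch of that copy (the HDFS path matches the property value above; the local target directory is just an example):

# Pull the aggregated application logs from HDFS to the local filesystem.
# "hdfs dfs" is the current form of the deprecated "hadoop dfs".
mkdir -p /var/log/yarn-aggregated
hdfs dfs -copyToLocal /app-logs /var/log/yarn-aggregated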

 

Do you think that can work for you?

Let me know if you have any more questions on the above.

Expert Contributor

@Daggers 

 

You can also look at the HDFS NFS Gateway, which allows the HDFS filesystem to be mounted on the local OS, exposed via NFS.

 

https://hadoop.apache.org/docs/r2.8.0/hadoop-project-dist/hadoop-hdfs/HdfsNfsGateway.html
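Once the gateway is up, the mount itself is a one-liner. A minimal sketch, assuming an example gateway host and mount point (the mount options are the ones the linked document recommends):

# Mount the HDFS root on the local OS via the NFS gateway (run as root).
mkdir -p /hdfs_mount
mount -t nfs -o vers=3,proto=tcp,nolock,noacl,sync nfsgateway.example.com:/ /hdfs_mount
# The aggregated logs then appear as ordinary files under /hdfs_mount/app-logs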

Explorer

Hey, great suggestion! I think this might work.

Explorer

I wonder if there is a way to ensure that all the files have finished being written to /tmp/log (the location of yarn.nodemanager.remote-app-log-dir at my site) before I copy them?

Expert Contributor

Hi @Daggers 

 

You can write a simple script using the YARN REST API to fetch only the completed applications (per month or per day) and copy only those applications' logs from HDFS to local; a rough sketch follows below. Please check the link below:

 

https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html

[Attached screenshot: yarnrestapi.PNG]
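For illustration, a rough sketch of such a script (the ResourceManager host, port, and local target path are assumptions, and it relies on jq being installed; the HDFS layout follows the remote-app-log-dir and suffix values quoted earlier in the thread):

#!/usr/bin/env bash
# Rough sketch: list yesterday's finished applications via the ResourceManager
# REST API, then copy each one's aggregated logs out of HDFS.
RM=http://resourcemanager.example.com:8088
BEGIN=$(date -d 'yesterday 00:00' +%s%3N)   # finishedTimeBegin, in milliseconds
END=$(date -d 'today 00:00' +%s%3N)         # finishedTimeEnd, in milliseconds
mkdir -p /var/log/yarn-aggregated

curl -s "$RM/ws/v1/cluster/apps?states=FINISHED&finishedTimeBegin=$BEGIN&finishedTimeEnd=$END" \
  | jq -r '.apps.app[]? | "\(.user) \(.id)"' \
  | while read -r OWNER APP_ID; do
      # Layout assumed: <remote-app-log-dir>/<owner>/<suffix>/<application-id>
      hdfs dfs -copyToLocal "/app-logs/$OWNER/logs-ifile/$APP_ID" \
        "/var/log/yarn-aggregated/$APP_ID"
    done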

 

Expert Contributor

Hi @Daggers 

 

Please feel free to select the best answer to close the thread if your questions have been answered.

 

Thanks