Member since: 07-26-2016
Posts: 24
Kudos Received: 7
Solutions: 5
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3559 | 03-07-2018 11:16 PM
 | 26364 | 08-20-2017 12:08 AM
 | 1498 | 06-01-2017 08:26 PM
 | 1549 | 06-01-2017 05:59 PM
 | 2083 | 05-20-2017 12:30 AM
01-06-2020 09:37 AM
Hi, the link below provides more information on your question: https://spark.apache.org/docs/latest/configuration.html Thanks, Arun
03-31-2017 09:25 PM
2 Kudos
To clear the local file cache and user cache for YARN, perform the following steps:
Step 1. Find the cache location by checking the value of the yarn.nodemanager.local-dirs property:
<property>
  <name>yarn.nodemanager.local-dirs</name>
  <value>/hadoop/yarn/local</value>
</property>
Step 2. Remove the filecache and usercache folders located inside each folder specified in yarn.nodemanager.local-dirs:
[yarn@node2 ~]$ cd /hadoop/yarn/local/
[yarn@node2 local]$ ls
filecache  nmPrivate  spark_shuffle  usercache
[yarn@node2 local]$ rm -rf filecache/ usercache/
Step 3. If more than one folder is configured, clean them one by one.
Step 4. Restart the YARN service.
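The steps above can be sketched as a small shell loop, since yarn.nodemanager.local-dirs may hold a comma-separated list of directories. The LOCAL_DIRS value below is an assumed example; substitute the value from your cluster.

```shell
# Sketch: clear the YARN filecache/usercache in every configured local dir.
# LOCAL_DIRS mirrors yarn.nodemanager.local-dirs, which may be a
# comma-separated list; the paths here are assumed examples.
LOCAL_DIRS="/hadoop/yarn/local,/hadoop1/yarn/local"

# Split on commas and remove both cache folders from each directory.
IFS=',' read -ra DIRS <<< "$LOCAL_DIRS"
for d in "${DIRS[@]}"; do
  # -f keeps the loop quiet if a folder is already gone
  rm -rf "$d/filecache" "$d/usercache"
done
```

Stop the NodeManager before cleaning and restart YARN afterwards, as in the steps above.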
03-31-2017 05:57 PM
Perform the following steps to enable verbose logging for the Oozie launcher.
Step 1. Add the property below to the action's configuration section in the workflow file:
<configuration>
.....
<property>
<name>oozie.launcher.mapreduce.map.java.opts</name>
<value>-verbose</value>
</property>
</configuration>
Step 2. Upload the updated workflow file to the workflow folder defined by oozie.wf.application.path in the job property file.
Step 3. Submit the workflow. You should now see verbose output for the Oozie launcher, such as the class-loading information below:
[Loaded java.lang.ExceptionInInitializerError from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded org.apache.commons.logging.impl.LogFactoryImpl$2 from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar]
[Loaded org.apache.commons.logging.impl.LogFactoryImpl$1 from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar]
[Loaded org.apache.commons.logging.Log from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar]
[Loaded org.apache.commons.logging.impl.Log4JLogger from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar]
[Loaded org.apache.log4j.spi.AppenderAttachable from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar]
[Loaded org.apache.log4j.Category from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar]
[Loaded org.apache.log4j.Logger from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar]
[Loaded org.apache.log4j.Priority from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar]
[Loaded org.apache.log4j.Level from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar]
[Loaded java.lang.InstantiationError from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded sun.reflect.UnsafeFieldAccessorFactory from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded sun.reflect.UnsafeQualifiedStaticFieldAccessorImpl from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded sun.reflect.UnsafeQualifiedStaticObjectFieldAccessorImpl from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded java.util.HashMap$EntrySet from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded java.util.HashMap$HashIterator from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded java.util.HashMap$EntryIterator from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded java.util.MissingResourceException from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded org.apache.log4j.LogManager from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar]
[Loaded java.net.MalformedURLException from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
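To make output like the above easier to digest, here is a hedged sketch that summarizes which jars classes were loaded from. It assumes the launcher's stdout has been saved to a local file (launcher.log is an assumed filename, e.g. captured with `yarn logs -applicationId <app_id> > launcher.log`); the heredoc below just seeds a few sample lines copied from the output above.

```shell
# Three sample lines copied from the launcher output stand in for a real
# log; replace launcher.log with your saved launcher stdout.
cat > launcher.log <<'EOF'
[Loaded org.apache.log4j.Logger from file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar]
[Loaded java.util.HashMap$EntrySet from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
[Loaded java.util.HashMap$HashIterator from /usr/jdk64/jdk1.8.0_60/jre/lib/rt.jar]
EOF

# Count how many classes were loaded from each jar, busiest jar first.
grep '^\[Loaded ' launcher.log \
  | sed 's/^\[Loaded .* from \(.*\)\]$/\1/' \
  | sort | uniq -c | sort -rn
```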
03-24-2017 11:09 PM
PROBLEM DESCRIPTION
The Oozie service check fails and the following error message is displayed in Ambari:
stderr: /var/lib/ambari-agent/data/errors-12523.txt
Python script has been killed due to timeout after waiting 300 secs
There is no error in stdout. The service check is terminated because the timeout (300 secs by default) is reached.
CAUSE
This issue occurs when uploading the jar and workflow files to HDFS for the Oozie service check takes Ambari longer than the timeout configured in the server settings. Example: nodes in a cluster are configured behind an IPv4 proxy, which slows the network between nodes; because of the poor network performance, the time required to upload these files exceeds the 300-sec timeout.
WORKAROUND
Increase the timeout by editing the value set in /var/lib/ambari-server/resources/common-services/OOZIE/your_version_number/metainfo.xml.
RESOLUTION
Improve the network performance so that the Oozie service check can finish within the 300-sec timeout period.
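As a minimal sketch of the workaround, assuming the service-check timeout appears as a <timeout>300</timeout> element in that metainfo.xml (600 is an arbitrary example value, and your_version_number stays a placeholder for the directory matching your stack):

```shell
# Sketch: raise the Oozie service-check timeout from 300 to 600 seconds.
# The path keeps the your_version_number placeholder from the workaround
# above; the guard avoids errors on hosts where Ambari is not installed.
METAINFO=/var/lib/ambari-server/resources/common-services/OOZIE/your_version_number/metainfo.xml
if [ -f "$METAINFO" ]; then
  sed -i 's|<timeout>300</timeout>|<timeout>600</timeout>|' "$METAINFO"
  # ambari-server restart   # restart so the new timeout takes effect
fi
```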
03-24-2017 10:43 PM
To configure an Oozie workflow after enabling ResourceManager HA, do the following:
Step 1. Find the YARN ResourceManager cluster id: in Ambari, go to YARN > Configs and search for the property yarn.resourcemanager.cluster-id.
Step 2. In the job property file, set jobTracker to that cluster id, for example: jobTracker=yarn-ha
Step 3. Submit the Oozie job using the modified job property file.
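For illustration, here is a job.properties sketch for the steps above. jobTracker matches the yarn-ha cluster id from the example, while the nameNode URI and application path are assumed example values, not something from the original post.

```shell
# Sketch: write a job.properties for a ResourceManager-HA cluster.
# nameNode and oozie.wf.application.path are assumed examples;
# jobTracker must equal the value of yarn.resourcemanager.cluster-id.
cat > job.properties <<'EOF'
nameNode=hdfs://mycluster
jobTracker=yarn-ha
oozie.wf.application.path=${nameNode}/user/${user.name}/apps/my-wf
EOF

# Then submit (oozie-host is a placeholder for your Oozie server):
#   oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run
```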