Member since: 09-30-2020
Posts: 5
Kudos Received: 0
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 5613 | 11-30-2020 02:51 AM
 | 2040 | 10-04-2020 06:33 AM
11-30-2020
02:51 AM
I realized it isn't an issue with log4j2. The command was timing out after 270 seconds, which is the default. Increasing the property below to 500 resolved the issue. Thanks 🙂

Oozie > Configuration > search for "Oozie Upload ShareLib Command Timeout" and update the value to 500.
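The same change can also be scripted against the Cloudera Manager REST API instead of the UI. This is a hedged sketch: the host, cluster name, and the internal config key name (`oozie_upload_sharelib_cmd_timeout`) are assumptions you should verify against your CM version's API docs before using.

```python
import json

# Placeholders -- substitute your own CM host and cluster name.
CM_HOST = "http://cm-host.example.com:7180"
ENDPOINT = "/api/v19/clusters/Cluster%201/services/oozie/config"

def build_timeout_payload(seconds):
    """Build the JSON body for a CM service-config update request.

    NOTE: the config key name below is an assumption; confirm the exact
    internal name for "Oozie Upload ShareLib Command Timeout" in your CM.
    """
    return json.dumps({
        "items": [
            {"name": "oozie_upload_sharelib_cmd_timeout",
             "value": str(seconds)}
        ]
    })

payload = build_timeout_payload(500)
# To apply it (requires admin credentials), something like:
#   curl -u admin:admin -X PUT -H "Content-Type: application/json" \
#        -d "$payload" "$CM_HOST$ENDPOINT"
print(payload)
```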
11-30-2020
12:39 AM
Hi, I was trying to install Cloudera Manager 6 with CDH 6.3.0 for a 3-node cluster. I downloaded and ran cloudera-manager-installer.bin on CentOS 6 and installed the Cloudera Manager server and agent. I then logged into CM as admin:admin to configure the cluster, and I am stuck at the Add Cluster Configuration step: the Upload Oozie ShareLib step failed with "no log4j2 configuration file found". I also checked /var/log/cloudera-scm-server/cloudera-scm-server.log and found the same error message there. Is log4j2 something I need to install manually before configuring the cluster? If so, please suggest how to do that. stdout and stderr are below for reference.

stdout:

Mon Nov 30 08:10:40 UTC 2020
JAVA_HOME=/usr/java/jdk1.8.0_181-cloudera
using 6 as CDH_VERSION
CONF_DIR=/var/run/cloudera-scm-agent/process/61-oozie-OOZIE-SERVER-upload-sharelib
CMF_CONF_DIR=
Found Hadoop that supports Erasure Coding. Trying to disable Erasure Coding for path: /user/oozie/share/lib
Done
the destination path for sharelib is: /user/oozie/share/lib/lib_20201130081046
Running 1814 copy tasks on 8 threads

stderr:

org.apache.oozie.tools.OozieSharelibCLI create -fs hdfs://master.us-central1-a.c.karthikproject.internal:8020 -locallib /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/oozie/oozie-sharelib-yarn -concurrency 8
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/slf4j-simple-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console. Set system property 'org.apache.logging.log4j.simplelog.StatusLogger.level' to TRACE to show Log4j2 internal initialization logging.
[30/Nov/2020 08:15:14 +0000] 1740 MainThread redactor INFO Killing with SIGTERM
Labels:
- Cloudera Manager
10-04-2020
06:33 AM
Thanks for your response, @tjangid. That was informative, but unfortunately it is not what I was looking for. Since I didn't find a suitable API, I used urllib2 to connect to the History Server directly and then used some regular expressions to extract the required content, which is hive.access.subject.name.
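A minimal sketch of the approach described above: fetch an application's configuration from the History Server and pull out hive.access.subject.name with a regular expression. The exact page format varies, so the `key=value` shape assumed by the regex, and the sample string, are illustrative assumptions.

```python
import re

def extract_subject_name(conf_text):
    """Extract the hive.access.subject.name value from a config dump.

    Assumes the property appears in a key=value or key: value form;
    adjust the pattern to match your History Server's actual output.
    """
    m = re.search(r"hive\.access\.subject\.name\s*[=:]\s*(\S+)", conf_text)
    return m.group(1) if m else None

# In Python 2 (as in the original post) the fetch itself would look like:
#   import urllib2
#   conf_text = urllib2.urlopen(conf_url).read()

sample = "hive.access.subject.name=alice"   # illustrative sample
print(extract_subject_name(sample))          # -> alice
```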
09-30-2020
09:29 AM
Hi Team, I want a list of all running YARN/Spark applications along with the actual user name for queries running as the hive user. 1) I am aware that we can use yarn application -list to get the list of running applications, and 2) that hive.access.subject.name is shown on the configuration page of the YARN history server / ResourceManager. But for point 2, I would need to check it manually for each application, which is not viable. So I am trying to find out whether there is any API for the YARN and Spark history servers that returns the actual user of a query/application, or a command that does the same for a running application. Kindly let me know if any additional information is required.
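For the listing half of the question, the YARN ResourceManager does expose a REST API (`/ws/v1/cluster/apps`) that returns running applications as JSON, including a `user` field. Below is a sketch that parses such a response; the RM host and the sample data are placeholders. Note the caveat that for impersonated Hive queries the `user` field will typically show the service user (hive), so the real submitter still has to come from the job configuration (hive.access.subject.name).

```python
import json

# Placeholder ResourceManager address.
RM_URL = "http://resourcemanager.example.com:8088"

def running_apps(apps_json):
    """Return (id, user, name) tuples from an RM /ws/v1/cluster/apps response."""
    body = json.loads(apps_json)
    # The RM returns {"apps": null} when no applications match.
    apps = (body.get("apps") or {}).get("app") or []
    return [(a["id"], a["user"], a["name"]) for a in apps]

# Fetching (Python 2 urllib2, as elsewhere in this thread):
#   import urllib2
#   apps_json = urllib2.urlopen(
#       RM_URL + "/ws/v1/cluster/apps?states=RUNNING").read()

# Illustrative sample response:
sample = json.dumps({"apps": {"app": [
    {"id": "application_1601459580000_0001", "user": "hive",
     "name": "HIVE-query-xyz"}]}})
for app_id, user, name in running_apps(sample):
    print(app_id, user, name)
```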
Labels: