Member since: 08-01-2013
Posts: 187
Kudos Received: 10
Solutions: 8
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2213 | 09-01-2016 09:26 PM
 | 1903 | 10-14-2015 07:31 AM
 | 2058 | 06-21-2015 06:02 PM
 | 3736 | 02-26-2015 04:36 PM
 | 4220 | 02-18-2015 12:18 AM
09-05-2017
07:04 AM
2 Kudos
Symptoms A Spark job fails with INTERNAL_FAILURE. In the WA (Workload Analytics) page of the job that failed, the following message is reported:

org.apache.spark.SparkException: Application application_1503474791091_0002 finished with failed status

Diagnosis Because Telemetry Publisher did not retrieve the application log due to a known bug, we have to diagnose the application logs (application_1503474791091_0002) directly; they are stored in the user's S3 bucket. If the following exception is found, it indicates that the application failed to resolve a dependency on the Hadoop classpath:

17/08/24 13:13:33 INFO ApplicationMaster: Preparing Local resources
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.tracing.TraceUtils.wrapHadoopConf(Ljava/lang/String;Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/htrace/core/HTraceConfiguration;
    at org.apache.hadoop.fs.FsTracer.get(FsTracer.java:42)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:687)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:671)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:155)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2653)

This most likely occurred because the jar was built against another Hadoop distribution's repository, for example EMR (Amazon Elastic MapReduce).
Solution To resolve the issue, rebuild the application against the CDH repository (https://repository.cloudera.com/artifactory/cloudera-repos/) using Maven or sbt. An example of using the CDH Maven repository is documented here: https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_cdh5_maven_repo.html
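As a rough sketch, the relevant pom.xml sections might look like the following. The artifact versions shown are illustrative only; match them to your cluster's actual CDH release, per the repository documentation linked above.

```xml
<!-- Illustrative pom.xml fragment: point the build at the CDH repository and
     compile against CDH artifacts instead of another distribution's.
     Versions below are examples, not a recommendation. -->
<repositories>
  <repository>
    <id>cloudera</id>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.0-cdh5.12.0</version>
    <!-- provided: the cluster supplies these jars at runtime,
         so they must not be bundled into the application jar -->
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0-cdh5.12.0</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

Marking the Hadoop and Spark dependencies as `provided` keeps the cluster's own classes on the classpath at runtime, which avoids the NoSuchMethodError shown above.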
06-21-2017
02:11 PM
3 Kudos
Question Where does Workload Analytics ingest and analyze user workloads?
Answer Workload Analytics runs as part of Cloudera Altus, on an environment operated by Cloudera. Telemetry Publisher, part of the Cloudera Manager installation, sends the user's workload to that environment as soon as a job ends, where it is analyzed. The results then appear in the Cloudera Altus UI. https://www.cloudera.com/documentation/altus/topics/wa_overview.html
06-21-2017
12:57 PM
1 Kudo
Question Is it possible to tune the thresholds for the Health Checks that Workload Analytics performs, based on the user's requirements?
Answer No, the thresholds cannot be tuned. The Health Checks use predefined thresholds, which are described in this document: https://www.cloudera.com/documentation/altus/topics/wa_analyze_jobs.html
09-04-2016
06:23 PM
Hi, Are you sure the blocks still exist on the DataNode hosts even after rebooting the instances? By default, they should be located under /dfs/dn{1,2,...}.
09-01-2016
09:26 PM
1 Kudo
A custom jar file like that, which HBase uses, cannot be distributed across the hosts by CM automatically. You have to place it on each host yourself. HTH. -- Sent from my mobile
08-24-2016
08:04 PM
Hi, That WARN message was introduced by an improvement from HDFS-9260. However, because our backport missed the sorting step, the WARN still appears. It has already been addressed in https://github.com/cloudera/hadoop-common/commit/95c7d8fbe122de617d11b6e4ea7d101803d0bd12, and the fix is available in CDH 5.7.2 onwards and also in the 5.8.x series.
08-23-2016
09:33 PM
Even if you hit a single-host crash, the corresponding blocks are replicated on the other hosts. That is, the HFiles at the HDFS level should be safe. Or are you running the cluster in standalone mode? Thanks, Dice. -- Sent from my mobile
12-02-2015
02:44 AM
Hi, I'm unsure whether your /etc/hosts is the real one, but make sure you meet all the requirements under "Networking and Security Requirements" in the following guide: http://www.cloudera.com/content/www/en-us/documentation/enterprise/latest/topics/cm_ig_cm_requirements.html Also, given the "agent not running as root" message, it looks like you're enabling single user mode. Is that correct? Have you followed the guide below? http://www.cloudera.com/content/www/en-us/documentation/enterprise/latest/topics/install_singleuser_reqts.html
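For reference, a minimal /etc/hosts layout that satisfies those requirements might look like the following. The hostnames and IPs here are made up; the key points are that each entry maps the IP to the FQDN first and the short name second, and that no cluster hostname is mapped to 127.0.0.1.

```
# Illustrative /etc/hosts: FQDN before short name on each line.
127.0.0.1      localhost
192.168.1.10   cm-host.example.com   cm-host
192.168.1.11   worker1.example.com   worker1
192.168.1.12   worker2.example.com   worker2
```

With this layout, `hostname -f` on each host should return the FQDN, which is what the CM agent reports to the server.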
10-14-2015
07:31 AM
1 Kudo
Hi, Please note that Cloudera Search is included in CDH 5. As the CDH 5.4.3 parcel appears to be activated already, you can simply add it via "Add Services" from the CM home page.
06-21-2015
06:02 PM
Are you using Cloudera Manager 5.4 or higher? If you're still on 5.2 or 5.3, the Kafka CSD needs to be downloaded per http://www.cloudera.com/content/cloudera/en/documentation/cloudera-kafka/latest/topics/kafka_installing.html; after that, you should see the parcels become available.