Member since
10-01-2018
802
Posts
143
Kudos Received
130
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3072 | 04-15-2022 09:39 AM |
| | 2474 | 03-16-2022 06:22 AM |
| | 6552 | 03-02-2022 09:44 PM |
| | 2907 | 03-02-2022 08:40 PM |
| | 1914 | 01-05-2022 07:01 AM |
09-29-2020
01:16 AM
1 Kudo
@rjpsarath2 Please carefully read the section below on the same page:

What external websites will I have access to while taking the exam? The exam gateway has a firewall that restricts your access to external resources. You will have whitelisted access to the Apache and Cloudera websites. Other sites we determine to be necessary and appropriate for a given exam may also be made available. Websites like Google, StackOverflow, and other search functionality will not be accessible. Downloading external code or packages is not permitted.

Here, "Apache and Cloudera websites" means the following:

Apache: https://hadoop.apache.org/docs/stable/
Cloudera: https://docs.cloudera.com/documentation/index.html

Regarding the tools: as far as I know, during the exam you are given an environment where you have to perform the tasks, so you can use any commands available in that OS or environment. If you are expecting to use a tool outside of the exam window, that is not allowed. The exam guide clearly says:

- You may not launch any other application on your computer during the exam.
- You may not take screenshots of the exam environment.

Hope this helps. Let me know if you still have any questions.
09-28-2020
12:49 PM
@rjpsarath2 This FAQ page should resolve most of your queries. Please check below: https://www.cloudera.com/about/training/certification/faq.html
09-28-2020
12:47 PM
1 Kudo
@b1995assuncao You can do that from the date picker in the top right corner by selecting a custom range. Hope the screenshot below helps.
09-28-2020
12:02 PM
@HanzalaShaikh You can check the NameNode log file hadoop-cmf-hdfs-NAMENODE-namenode1.us-east1-b.c.coherent-elf-271314.internal.log.out for errors. Then try a clean restart of HDFS and see if that helps.
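A quick way to surface recent problems in that log is a grep for ERROR/FATAL lines. This is only a sketch: the file name is the one from the post, but the directory (`/var/log/hadoop-hdfs`) is an assumption based on typical CM-managed installs, so adjust the path for your host.

```shell
# Log file name from the post above; the directory is an assumption
# (CM-managed HDFS role logs usually live under /var/log/hadoop-hdfs).
LOG=/var/log/hadoop-hdfs/hadoop-cmf-hdfs-NAMENODE-namenode1.us-east1-b.c.coherent-elf-271314.internal.log.out

# Show the most recent ERROR/FATAL lines, if the file exists.
if [ -f "$LOG" ]; then
  grep -iE 'ERROR|FATAL' "$LOG" | tail -n 20
else
  echo "log not found: $LOG"
fi
```

If nothing matches, widen the pattern to include WARN lines around the failing startup.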
09-28-2020
02:42 AM
1 Kudo
@muslihuddin Yes, that's right.
09-28-2020
01:35 AM
2 Kudos
@muslihuddin A CM server outage will not affect any HDFS data. For your specific use case you need:

1. A backup of the CM database (NOTE: this is the most important part, as it is what gets your old configuration back).
2. Install the CM server on another host and restore the old CM DB backup; you should then be able to monitor the old cluster again.

Your old monitoring data will not be available, since it resides on a different node, so you will lose it at this point. That means old information in charts and dashboards will no longer be available.
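As a sketch of step 1: if the CM database is MySQL/MariaDB with the default database name "scm" (both assumptions here; the actual user, host, and database name are in /etc/cloudera-scm-server/db.properties, and a PostgreSQL-backed CM would use pg_dump instead), the backup could look like this hypothetical helper:

```shell
# Sketch of a CM DB backup, assuming a MySQL/MariaDB backend and the
# default database name "scm". Check /etc/cloudera-scm-server/db.properties
# on the CM host for the real settings; use pg_dump for PostgreSQL.
backup_cm_db() {
  local outdir=${1:-/var/backups}
  mkdir -p "$outdir"
  # Prompts for the scm DB user's password and writes a dated dump file.
  mysqldump --user=scm --password --databases scm \
    > "$outdir/cm_db_backup_$(date +%F).sql"
}
```

Run it on the CM server host, e.g. `backup_cm_db /backup/cm`, and copy the dump off-host before the migration.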
09-25-2020
02:59 AM
@mohammad_shamim You might be hitting a known bug with recent Java versions (OpenJDK 1.8u242 or JDK 11.0.6); see TSB-394. To resolve this issue, take the following action on all impacted nodes, as appropriate for the environment: edit the java.security file located in the active JDK on the cluster, and add or alter the sun.security.krb5.disableReferrals parameter to ensure that the following is set to true: sun.security.krb5.disableReferrals=true
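That edit can be scripted so it works whether or not the parameter already exists. This is a hypothetical helper, not a Cloudera-provided script; note the java.security location varies by JDK (e.g. `$JAVA_HOME/jre/lib/security/java.security` on JDK 8, `$JAVA_HOME/conf/security/java.security` on JDK 11), so pass the correct path for your active JDK.

```shell
# Set sun.security.krb5.disableReferrals=true in the given java.security
# file, replacing an existing line or appending one if it is absent.
set_disable_referrals() {
  local sec_file=$1
  if grep -q '^sun.security.krb5.disableReferrals=' "$sec_file"; then
    sed -i 's/^sun.security.krb5.disableReferrals=.*/sun.security.krb5.disableReferrals=true/' "$sec_file"
  else
    printf 'sun.security.krb5.disableReferrals=true\n' >> "$sec_file"
  fi
}
```

For example, on JDK 8: `set_disable_referrals "$JAVA_HOME/jre/lib/security/java.security"`, then restart the affected services.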
09-24-2020
01:22 AM
@Yuriy_but You are likely hitting a known issue. If your version is later than CM 5.15.1, please follow these steps and see:

1. From Cloudera Manager, stop the Navigator Metadata Server.
2. On the host where the Navigator Metadata Server is installed, navigate to the jars directory:
$ cd /usr/share/cmf/cloudera-navigator-server/jars
3. List the contents that start with c3p0:
$ ls c3p0*
The list may include two jar files:
c3p0-0.9.1.2.jar
c3p0-0.9.5.2.jar
4. If the file c3p0-0.9.5.2.jar is present, remove it:
$ rm c3p0-0.9.5.2.jar
5. Navigate to the Navigator wars directory:
$ cd /usr/share/cmf/cloudera-navigator-server/wars/
6. Delete the same file c3p0-0.9.5.2.jar from the nav-core-webapp* war file:
$ zip -d nav-core-webapp-2.14.1.war WEB-INF/lib/c3p0-0.9.5.2.jar
7. From Cloudera Manager, start the Navigator Metadata Server.

If this is an air-gapped environment (no internet access), you might also need these steps:

1. Identify the IP address of the NMS (can be gathered under Cloudera Management Service -> Instances -> Navigator Metadata Server).
2. Add the following lines to the bottom of /etc/hosts on the NMS:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
<NMS_IP_ADDRESS> java.sun.com
3. On the NMS, install httpd for your Linux distribution. As this host does not have internet access, you will need to manually import and install the RPMs: https://httpd.apache.org/docs/2.4/platform/rpm.html
4. Make a directory for the Java DTD file and adjust its permissions:
mkdir /var/www/html/dtd; chmod 755 /var/www/html/dtd
5. Acquire the Java DTD file from the following URL: http://java.sun.com/dtd/web-app_2_3.dtd
6. Copy the file to the /var/www/html/dtd directory on the NMS. Ensure the file is readable by "other".
7. Enable and start the httpd server:
sudo systemctl enable httpd; sudo systemctl start httpd
8. Start the NMS.

NOTE: httpd must be running for each NMS restart; this is not a one-off workaround. The systemctl commands should ensure that this is the case. If the above workaround is not possible, a patch is also available from Cloudera Support until this is resolved in an upcoming release of Cloudera Manager.
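The jar-removal steps (2-6) can be condensed into one guarded script. This is only a sketch: the paths and the war version (nav-core-webapp-2.14.1.war) are the ones from the steps above and may differ on your host, and the function name is hypothetical.

```shell
# Remove the stale c3p0-0.9.5.2.jar from the Navigator jars directory and
# from inside the nav-core-webapp war. Default paths match the steps above.
remove_stale_c3p0() {
  local jars=${1:-/usr/share/cmf/cloudera-navigator-server/jars}
  local wars=${2:-/usr/share/cmf/cloudera-navigator-server/wars}
  if [ -f "$jars/c3p0-0.9.5.2.jar" ]; then
    rm "$jars/c3p0-0.9.5.2.jar"
  fi
  # zip -d deletes the named entry from the archive in place.
  if [ -f "$wars/nav-core-webapp-2.14.1.war" ]; then
    zip -d "$wars/nav-core-webapp-2.14.1.war" WEB-INF/lib/c3p0-0.9.5.2.jar
  fi
}
```

Run it on the NMS host between stopping and starting the Navigator Metadata Server in Cloudera Manager.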
09-23-2020
03:41 PM
@bsoliveiram That seems to be a typo in the doc; check this latest doc: https://docs.cloudera.com/documentation/enterprise/latest/topics/spark_mllib.html
09-23-2020
01:06 PM
@bsoliveiram I don't see any table talking about the CDH version. Can you check your question again?

| OS | Package Name | Package Version |
|---|---|---|
| RHEL 7.1 | libgfortran | 4.8.x |
| SLES 11 SP3 | libgfortran3 | 4.7.2 |
| Ubuntu 12.04 | libgfortran3 | 4.6.3 |
| Ubuntu 14.04 | libgfortran3 | 4.8.4 |
| Debian 7.1 | libgfortran3 | 4.7.2 |
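To check which libgfortran package is installed on a given node, a small distro-aware sketch like the following can help (the function name is hypothetical; package names vary by distro as in the table above):

```shell
# Report the installed libgfortran package version, using rpm on
# RHEL/SLES-style systems and dpkg on Ubuntu/Debian-style systems.
check_libgfortran() {
  if command -v rpm >/dev/null 2>&1; then
    rpm -q libgfortran libgfortran3 2>/dev/null
  elif command -v dpkg >/dev/null 2>&1; then
    dpkg -l 'libgfortran*' 2>/dev/null
  else
    echo "neither rpm nor dpkg found" >&2
    return 1
  fi
}
```

Compare the reported version against the row for your OS in the table above.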