Member since: 08-03-2018
Posts: 26
Kudos Received: 2
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 14269 | 05-30-2023 07:29 PM |
| | 8065 | 05-02-2019 05:18 AM |
05-30-2023
07:29 PM
2 Kudos
Hi @steven-matison,

The issue was fixed after making the following two changes in the /etc/krb5.conf file:

1. There was an invalid include line in my /etc/krb5.conf. I removed this line: `includedir /etc/krb5.conf.d/`
2. On macOS the default Kerberos client does not fall back to TCP. If your corporate network blocks UDP, prefix the kdc value with `tcp/` to force the client to use TCP: `kdc = tcp/kdc.example.com:88`

Regards,
Banshi.
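For readers hitting the same problem, the two changes combine into a minimal /etc/krb5.conf along these lines. EXAMPLE.COM and kdc.example.com are placeholders from this thread; substitute your own realm and KDC host, and note that the `includedir` line is gone:

```ini
[libdefaults]
    default_realm = EXAMPLE.COM

[realms]
    EXAMPLE.COM = {
        # The "tcp/" prefix forces TCP; the default macOS client
        # does not fall back from UDP to TCP on its own.
        kdc = tcp/kdc.example.com:88
    }
```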
05-30-2023
06:44 AM
Hi @steven-matison,

You are right. I replaced the actual realm with EXAMPLE.COM while posting. I checked connectivity using `nc -zv` and `ping`; connectivity is fine:

```
nc -zv <kdc_server_VIP> <KDC_Port>
Connection to xxxxxxxx port xxxxxx [tcp/sqlexec] succeeded!

--- <kdc_server_VIP> ping statistics ---
13 packets transmitted, 12 packets received, 7.7% packet loss
```

The error is:

```
kinit: krb5_get_init_creds: unable to reach any KDC in realm EXAMPLE.COM, tried 0 KDCs
```

Since it reports "tried 0 KDCs", I suspect kinit is not able to locate the krb5.conf file. When we run the kinit command on a Mac, does it read /etc/krb5.conf or some other location?

Regards,
Banshi.
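As a quick sanity check before digging into kinit itself, a small shell function can report which of the candidate config paths actually exists and is readable. This is a sketch; the path list is an assumption based on the two locations mentioned in this thread:

```shell
#!/bin/sh
# Print the first readable file from a list of candidate paths.
first_readable() {
    for f in "$@"; do
        if [ -r "$f" ]; then
            printf '%s\n' "$f"
            return 0
        fi
    done
    return 1
}

# Candidate krb5.conf locations on macOS (from this thread).
first_readable /etc/krb5.conf /Library/Preferences/edu.mit.kerberos \
    || echo "no readable krb5.conf candidate found"
```

If neither path is reported, kinit has no configuration to find a KDC with, which would explain "tried 0 KDCs".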
05-30-2023
03:48 AM
Hi Team,

I want to set up a Kerberos client on my Mac laptop running macOS Monterey (version 12.6.5). I have put the krb5.conf file at the paths below:

/etc/krb5.conf
/Library/Preferences/edu.mit.kerberos

But when I try to run kinit, I get the error below:

```
kinit -kt /Users/banshidhar_sahoo/Desktop/POC_KEYTAB/test.headless.keytab test@EXAMPLE.COM
kinit: krb5_get_init_creds: unable to reach any KDC in realm EXAMPLE.COM, tried 0 KDCs
```

I have also set the environment variable KRB5_CONFIG=/etc/krb5.conf, but kinit still fails with the same error.

Can you please suggest how to point kinit at krb5.conf so that it can reach the correct KDC server?

Regards,
Banshi.
Labels:
- Kerberos
02-16-2022
07:46 PM
Hi All,

I want to append the cluster name to the Cloudera alert mail subject, but I cannot find any configuration for this. Please suggest how to do it.

Current subject:
[Cloudera Alert] 4 Alerts since 2:31:41 AM

Desired subject (assuming the cluster name is ABC):
[Cloudera Alert - ABC] 4 Alerts since 2:31:41 AM

Regards,
Banshi.
Labels:
- Cloudera Manager
09-07-2020
06:17 AM
Hi @GangWar,

As per the CDH package information, Impala 3.2 ships with CDH 6.3.x, not with CDH 5. See the details below.

CDH 5.x comes with Impala 2.x:
https://docs.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_cdh_package_tarball_515.html#cm_vd_cdh_package_tarball_515

CDH 6.3.x comes with Impala 3.2:
https://docs.cloudera.com/documentation/enterprise/6/release-notes/topics/rg_cdh_63_packaging.html#concept_rtm_b5p_m3b

Can you elaborate on how to upgrade Impala from 2.11 to 3.2 on CDH 5.14?

Regards,
Banshi.
09-07-2020
03:51 AM
Hi All,
Good Afternoon.
We have CDH 5.14 and Impala 2.11 in one of our Cloudera clusters.
We have a requirement to upgrade Impala to 3.2 in the same cluster.
Please suggest the best way to achieve this.
Can we upgrade just Impala without upgrading CDH?
Regards,
Banshi.
Labels:
05-02-2019
05:18 AM
Hi @bgooley,

Thank you for your suggestion. We set the same property and it fixed the issue.

In Cloudera Manager --> YARN, we searched for "Gateway Client Environment Advanced Configuration Snippet (Safety Valve) for hadoop-env.sh" and added:

```
HADOOP_CLIENT_OPTS="-Djava.io.tmpdir=/ngs/app/$(whoami)/hadoop/tmp"
```

Regards,
Banshi.
04-30-2019
05:02 AM
Hi Team,

We are facing an issue where MapReduce jobs are failing with the error below.

Distribution: Cloudera (CDH 5.14.4)

```
Exception in thread "main" java.io.FileNotFoundException: /tmp/hadoop-unjar5272208588996002870/org/apache/hadoop/hive/metastore/api/ThriftHiveMetastore$AsyncClient$revoke_privileges_call.class (No space left on device)
    at java.io.FileOutputStream.open0(Native Method)
    at java.io.FileOutputStream.open(FileOutputStream.java:270)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
    at org.apache.hadoop.util.RunJar.unJar(RunJar.java:110)
    at org.apache.hadoop.util.RunJar.unJar(RunJar.java:81)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:141)
```

We figured out that the MR framework keeps temporary class files under /tmp in a directory named hadoop-unjarxxxxxx. Because there is not enough space in /tmp to hold the class files, the job fails with the above error.

We want to change the default location of the hadoop-unjarxxxxxx directory. We changed hadoop.tmp.dir in core-site.xml (safety valve) to /ngs/app/${user.name}/hadoop/tmp, but it did not help.

Any suggestion on how to change the default hadoop-unjar location from /tmp to somewhere else?

Regards,
Banshi.
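For context on why changing hadoop.tmp.dir did not help: the stack trace shows the failure inside org.apache.hadoop.util.RunJar, which unpacks the job jar under the JVM's java.io.tmpdir, so that JVM property is what needs to move. A sketch, assuming the /ngs/app path convention used in this thread (the `hadoop jar` invocation and its jar/class names are illustrative placeholders):

```shell
#!/bin/sh
# The hadoop-unjar* location comes from the JVM property java.io.tmpdir,
# not from hadoop.tmp.dir, so redirect it via HADOOP_CLIENT_OPTS.
# Create the target directory beforehand, e.g.:
#   mkdir -p "/ngs/app/$(whoami)/hadoop/tmp"
export HADOOP_CLIENT_OPTS="-Djava.io.tmpdir=/ngs/app/$(whoami)/hadoop/tmp"

# Illustrative submission (placeholder jar and main class):
#   hadoop jar my-job.jar com.example.Main
echo "$HADOOP_CLIENT_OPTS"
```

Setting this per shell works for ad-hoc runs; the accepted solution above applies the same variable cluster-wide through the YARN gateway safety valve.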
Labels:
- Cloudera Manager
- MapReduce