Member since 11-14-2017
18 Posts
3 Kudos Received
1 Solution
My Accepted Solutions
Title | Views | Posted |
---|---|---|
4953 | 03-21-2018 05:48 AM |
07-24-2018
02:57 AM
No, we couldn't resolve it, but we managed a workaround using Flume and Flafka.
07-23-2018
11:15 PM
The keytab file was generated on the server side. Anyway, the issue is solved: the machine where the ODBC client was being developed had both /etc/krb5.conf and /etc/hosts wrong. Thank you for your reply 🙂
03-21-2018
05:48 AM
1 Kudo
Hello everybody, The settings we provided to the app developers on the client side were right. They had a problem with their /etc/hosts configuration that kept them from authenticating properly to the KDC server, and they also had a bad /etc/krb5.conf. With that I close this topic.
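For anyone hitting the same symptom, a rough set of client-side checks, reusing the placeholder hostname, realm and principal from the question below (adapt them to your environment), might look like this:
# "hostname", "CLOUDERA" and "principal" are the placeholders from the question below.
getent hosts hostname                    # the Impala host must resolve consistently (bad /etc/hosts was the culprit here)
grep -i -A 3 CLOUDERA /etc/krb5.conf     # the realm and KDC must match what the cluster expects
kinit principal@CLOUDERA                 # Kerberos must work from this machine before the ODBC driver can
klist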
03-20-2018
08:01 AM
Hello,
We have a Kerberized cluster, and one of our users tells us that the following ODBC settings for Impala are not working:
impala <- src_impala(
drv = drv,
driver = "/opt/cloudera/impalaodbc/lib/64/libclouderaimpalaodbc64.so",
host = "hostname",
port = 21050,
database = "publication",
KrbRealm = "CLOUDERA",
KrbFQDN = "hostname",
KrbServiceName = "impala",
AuthMech = 1,
UseKeytab = 1,
UPNKeytabMappingFile = "/home/gentrif/talend.keytab"
)
Looking into the documentation, I told them to use the following settings instead:
impala <- src_impala(
drv = drv,
driver = "/opt/cloudera/impalaodbc/lib/64/libclouderaimpalaodbc64.so",
host = "hostname",
port = 21050,
database = "publication",
KrbRealm = "CLOUDERA",
KrbFQDN = "hostname",
KrbServiceName = "impala",
AuthMech = 1,
UseKeytab = 1,
DefaultKeytabFile = "/home/gentrif/talend.keytab",
uid = "principal"
)
The user tells us this last configuration is not working for their application either.
Maybe we are forgetting something important.
Thank you in advance.
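In case it helps, one way to isolate the problem before touching R (just a sketch, using the same keytab path and principal placeholders as above) is to check that the principal stored in the keytab is exactly the one passed as uid, and that the keytab works on its own:
# List the principals inside the keytab; the value of uid should match one of them exactly.
klist -kt /home/gentrif/talend.keytab
# Try a manual login with that keytab, bypassing the ODBC driver entirely.
kinit -kt /home/gentrif/talend.keytab principal@CLOUDERA
klist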
Labels:
- Apache Impala
- Kerberos
03-20-2018
01:53 AM
2 Kudos
Hello, We have configured Kafka with the SASL_PLAINTEXT authentication protocol. Someone wants to ingest into Kafka using a username and password; they say it's the only way to authenticate from their KafkaClient. The Cloudera documentation only describes the principal/keytab configuration. Is there a way to configure Kafka to allow multiple authentication mechanisms, as in https://docs.confluent.io/3.0.0/kafka/sasl.html#enabling-multiple-sasl-mechanisms-in-a-broker ? Thank you
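For reference, the Confluent page linked above boils down to enabling both mechanisms on the broker and declaring both login modules in the broker JAAS file. This is only a sketch of the upstream Apache Kafka configuration; Cloudera Manager does not expose these settings directly, so they would have to go into the Kafka broker safety valves, and whether your CDK version supports SASL/PLAIN should be checked first. All paths, usernames and passwords below are placeholders.
# Broker server.properties (or the equivalent safety valve entries):
listeners=SASL_PLAINTEXT://0.0.0.0:9092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=GSSAPI
sasl.enabled.mechanisms=GSSAPI,PLAIN
# Broker JAAS file: a single KafkaServer section declaring both login modules.
KafkaServer {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/kafka.keytab"
  principal="kafka/broker-host@EXAMPLE.COM";

  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="admin-secret"
  user_admin="admin-secret"
  user_ingest="ingest-secret";
};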
Labels:
- Apache Kafka
12-19-2017
01:14 AM
I didn't find it there. Do you know where I can download the RPM you mentioned? Thank you
12-15-2017
12:59 AM
@leary Do you know where the jar is placed in a 5.12.2 installation? Thank you
12-13-2017
04:11 AM
Hello Community, I've been trying to follow the steps in the documentation at https://www.cloudera.com/documentation/enterprise/5-12-x/topics/sg_hdfs_sentry_sync.html. I completed every step under "Enabling the HDFS-Sentry Plugin Using Cloudera Manager" and ran into an error when starting the HDFS service: HDFS couldn't find the Sentry plugin class. That was solved by executing
# yum install sentry-hdfs-plugin
I then managed to start the HDFS service, but the Hive Metastore Server stopped with the following error.
[main]: Metastore Thrift Server threw an exception...
MetaException(message:Failed to instantiate listener named: org.apache.sentry.binding.metastore.SentryMetastorePostEventListener, reason: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.sentry.hdfs.MetastorePlugin not found)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.getMetaStoreListeners(MetaStoreUtils.java:1514)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:555)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6313)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6308)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:6558)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:6485)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
I was looking for an answer since it seems to be an easy problem, but I found nothing related, so I think it will be useful to have it here to help others. For now I am working with the Cloudera Enterprise Trial virtual machine before moving this to our production environment with more hosts, so I am trying to catch most of the errors here before moving to the new scenario. Thank you
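For anyone else who hits this, the ClassNotFoundException for org.apache.sentry.hdfs.MetastorePlugin suggests the same sentry-hdfs-plugin package that fixed HDFS is also missing on the host running the Hive Metastore. That is an assumption on my part, but the check is cheap:
# On the Hive Metastore host (and the NameNode hosts), install the plugin and confirm the jar exists.
yum install -y sentry-hdfs-plugin
rpm -ql sentry-hdfs-plugin | grep '\.jar$'
# Then restart the Hive service from Cloudera Manager.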
Labels:
- Apache Hive
- Apache Sentry
- HDFS
12-13-2017
04:00 AM
We got past the error simply by configuring Cloudera to authenticate against a local KDC; we had been using a KDC provided by WSO2. The problem was solved, but not in the scenario where it first appeared.
11-30-2017
12:08 AM
Hi again @saranvisa, I checked the logs and saw that the error I was getting when starting a service came from a specific process, so I went into that process directory and checked hdfs.keytab for the error. When doing klist -kt hdfs.keytab I got the list of principals, tried a kinit with one of them, and it worked well. What I've seen is that the keytabs I was trying to klist before were old keytab files, modified a few weeks ago; the logs gave me the clue about which directory's keytab files to test. So we are at the same point: the krb5-workstation commands work fine, the keytabs were generated correctly, and the service keeps outputting the same error again and again. Any more ideas to test? Thank you
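For anyone retracing these steps, a rough loop over the Cloudera agent process directories (directory layout as in this thread) to spot stale or unreadable copies of hdfs.keytab could look like this:
# Inspect every hdfs.keytab materialised by the agent: show its age and try to read it.
for kt in /var/run/cloudera-scm-agent/process/*/hdfs.keytab; do
  echo "== $kt =="
  ls -l "$kt"
  klist -kt "$kt" || echo "unreadable keytab: $kt"
done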
11-29-2017
11:48 PM
Hello @saranvisa, I tested it again after regenerating the keytabs, and when doing klist -kt I got the following message.
# klist -kt hdfs.keytab
Keytab name: FILE:hdfs.keytab
klist: Unsupported key table format version number while starting keytab scan
This is not the same for the keytab files in other directories under /var/run/cloudera-scm-agent/process, only for some of them. Any idea what's happening? Why are some processes getting empty keytab files? I don't understand. Thank you for the help
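For reference, that klist message is typically what an empty or truncated keytab produces, so a quick sanity check on the affected files (a generic check, not specific to Cloudera Manager) would be:
# An empty or corrupt keytab usually triggers "Unsupported key table format version number".
ls -l hdfs.keytab     # size should be non-zero
file hdfs.keytab      # should not report "empty"
klist -kt hdfs.keytab # should list the principals and their key version numbers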
11-29-2017
08:01 AM
@saranvisa Yes, I've done all the steps on multiple occasions; the kinit command works fine with the imported keytabs, but HDFS keeps writing that error to its logs.
11-28-2017
11:15 PM
Sorry, I meant the kinit command worked; Cloudera keeps giving the exception mentioned first.
11-28-2017
08:17 AM
Sorry for the late response. I did that and it worked with kinit and an imported keytab. Thank you
11-15-2017
05:42 AM
Yes, the realm name is in uppercase, the same as in the examples
11-15-2017
12:02 AM
Hello, I've got a problem with Kerberos authentication using a keytab. When I try to start any instance of the HDFS service, I keep getting the following error:
org.apache.hadoop.security.KerberosAuthException: Login failure for user: hdfs/<fqdn>@<REALM.COM> from keytab hdfs.keytab
javax.security.auth.login.LoginException: Message stream modified (41)
I did not find any satisfactory answer for this problem, and the principals authenticate correctly with that keytab file through the kinit command. Thank you in advance.
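For completeness, the manual check mentioned above (the principal placeholders are the same as in the error message) was along these lines:
# List the service principals in the keytab and try a manual login with one of them.
klist -kt hdfs.keytab
kinit -kt hdfs.keytab hdfs/<fqdn>@<REALM.COM>
klist   # should show a valid TGT for the HDFS service principal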