Member since: 03-15-2018
Posts: 27
Kudos Received: 2
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 219 | 05-16-2018 07:51 AM |
09-04-2019 09:22 AM
I also encountered these errors: "Can't get Kerberos realm" and "Cannot locate default realm". They were caused by the quotes around the -Djava.security.krb5.conf parameter. I finally managed to connect DBeaver to Hive with Kerberos and SSL. My final dbeaver.ini config was:

--startup
plugins/org.eclipse.equinox.launcher_1.5.400.v20190515-0925.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.1000.v20190125-2016
-vmargs
-XX:+IgnoreUnrecognizedVMOptions
--add-modules=ALL-SYSTEM
-Xms64m
-Xmx1024m
-Djavax.security.auth.useSubjectCredsOnly=false
-Dsun.security.krb5.debug=true
-Djava.security.krb5.conf=/etc/krb5.conf
-Djava.security.auth.login.config=/home/matthieu/jaas.conf

With a jaas.conf like this:

Client {
com.sun.security.auth.module.Krb5LoginModule required
debug=true
doNotPrompt=true
useKeyTab=true
keyTab="/path/to/user.REALM.keytab"
useTicketCache=true
renewTGT=true
principal="user@REALM"
;
};

and the JDBC URL:

jdbc:hive2://{host}:{port}/{database};KrbRealm=MY_REALM;principal=hive/{host}@MY_REALM;ssl=true;sslTrustStore=/path/to/trustore;transportMode=http;httpPath=cliservice;trustStorePassword=changeit
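Before launching DBeaver, it can also help to rule out ticket problems from the shell; a minimal check, assuming the same keytab path and principal as in the jaas.conf above:

kinit -kt /path/to/user.REALM.keytab user@REALM   # obtain a TGT from the keytab
klist                                             # a valid ticket for user@REALM should be listed

If kinit already fails here, the problem is in krb5.conf or the keytab, not in DBeaver.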
07-25-2018 09:46 AM
We must set up NameNode HA before HDFS Federation. What are the advantages and the conceptual reasons for this ordering?
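For context on how the two features compose: each federated nameservice is itself an HA pair of NameNodes. A quick way to inspect that layout, assuming hypothetical IDs ns1/ns2 for the nameservices and nn1 for one NameNode:

hdfs getconf -confKey dfs.nameservices        # lists the federated nameservices, e.g. ns1,ns2
hdfs getconf -confKey dfs.ha.namenodes.ns1    # lists the HA pair inside one nameservice
hdfs haadmin -ns ns1 -getServiceState nn1     # active/standby state of one NameNode in ns1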
07-23-2018 01:54 PM
Files View could not be opened after enabling HDFS Federation. Is setting up a separate view for each nameservice the only solution? Also, after creating a directory, do we always have to mount it?
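For what it's worth, a directory can also be reached by addressing its nameservice URI directly, so a client-side mount is a convenience rather than a hard requirement; a minimal sketch, assuming hypothetical nameservices ns1 and ns2:

hdfs dfs -mkdir hdfs://ns2/data   # create a directory under the second nameservice
hdfs dfs -ls hdfs://ns1/          # list the first nameservice without any mount table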
05-31-2018 03:19 PM
Yes, that's required for PAM authentication to work. Happy to help.
06-02-2018 12:39 PM
Well, the configuration files were correct, but the environment was not set properly. I checked hbase-env on both nodes and found a difference. Updating it in Ambari with the following properties fixed it:

export LD_LIBRARY_PATH=::/usr/hdp/2.6.3.0-235/hadoop/lib/native/Linux-amd64-64:/usr/lib/hadoop/lib/native/Linux-amd64-64:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.6.3.0-235/hadoop/lib/native
export HADOOP_HOME=/usr/hdp/2.6.3.0-235/hadoop
export HADOOP_CONF_DIR=/usr/hdp/2.6.3.0-235/hadoop/etc/hadoop
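To confirm that the native libraries are actually picked up after a change like this, Hadoop ships a built-in check; run it on each affected node:

hadoop checknative -a    # reports whether the hadoop, zlib, snappy, ... native libraries were loaded
echo $LD_LIBRARY_PATH    # confirm the exported path reached the service's environment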
03-18-2019 12:00 PM
You're the best!
05-16-2018 07:51 AM
Well, the issue has been solved. It seems to be a bug in HDP 2.6. After setting up the one-way trust, you need to remove [domain_realm] and [capaths] from your krb5.conf. Also verify that the SPNEGO keytabs are properly created, with entries for all encryption types, and are present on every node.
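One way to do that last check is with klist; a sketch, assuming the default HDP keytab location (your path may differ):

klist -kte /etc/security/keytabs/spnego.service.keytab   # -e prints the encryption type of each entry

Run it on every node and compare the entries and encryption types.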
04-06-2018 07:31 AM
I am installing Apache Griffin on my HDP cluster. I followed the steps in this link. Apache Griffin requires a Spark URI to submit the Spark jobs:

# spark-admin
# spark.uri=http://10.149.247.156:28088
# spark.uri=http://10.9.246.187:8088

Which URI should I set?
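If spark.uri is meant to point at the YARN ResourceManager (an assumption on my part), one way to confirm a candidate endpoint is alive is its REST API; the host below is just taken from the commented line above:

curl http://10.9.246.187:8088/ws/v1/cluster/info   # a JSON response means the RM REST API answers here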
04-23-2018 12:36 PM
@Rajkumar Singh I did the same thing, but I'm getting either an HTTP 401 or 404 error, or a certificate error. The cluster I'm testing this on is also Kerberized.
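A common way to separate the Kerberos/SPNEGO part from the TLS part when debugging these errors is curl with negotiate auth; a sketch, with the URL as a placeholder to adapt and -k skipping certificate verification:

kinit user@REALM                                    # make sure a valid TGT exists first
curl -k --negotiate -u : "https://host:port/path"   # --negotiate performs SPNEGO; a 401 here points at Kerberos, not TLS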