Member since: 12-16-2016
Posts: 10
Kudos Received: 1
Solutions: 0
08-22-2017
05:28 AM
Hi Raju, We have a MySQL HA setup with master-master replication using Galera, across 3 nodes. I have no question about that setup itself. If we want to use the above setup:
1) Do we have to set up a VIP that resolves to all 3 nodes, and use that VIP in the HDP Hive configuration (Database URL, javax.jdo.option.ConnectionURL), like jdbc:mysql://mysql-vip/hive? or
2) Without a VIP, can we list all the MySQL host names, like jdbc:mysql://node1,node2,node3/hive, when configuring the Hive Database URL (javax.jdo.option.ConnectionURL)?
Also, with the above choices, do we need to use the MySQL JDBC driver or the MariaDB JDBC driver?
Thanks, Naveen.
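For what it's worth, MariaDB Connector/J documents a multi-host failover URL syntax, so option 2 is possible without a VIP. A sketch of the hive-site.xml property, assuming the MariaDB driver is on HiveServer2's classpath (node1..node3 are hypothetical hostnames):

```xml
<!-- Sketch only: multi-host failover URL for a Galera cluster via MariaDB Connector/J.
     "sequential" mode tries the hosts in order and fails over to the next on error. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mariadb:sequential://node1:3306,node2:3306,node3:3306/hive</value>
</property>
```

A VIP (option 1) also works and keeps the URL a plain single-host jdbc:mysql://mysql-vip/hive, at the cost of managing the VIP itself.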
08-21-2017
07:31 PM
Hi, We have MySQL Galera (master-master HA) configured on 3 nodes. While configuring the Hive Database URL (javax.jdo.option.ConnectionURL), what should the configuration look like?
Case 1) Can we set up one VIP (for all 3 nodes) and use the MariaDB JDBC driver? or
Case 2) Do we need to mention all 3 nodes, like jdbc:mysql://node1,node2,node3/hive?
What should the equivalent configuration be for Ranger and Ambari?
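For the Ambari side of the same question, Ambari's own database URL lives in /etc/ambari-server/conf/ambari.properties. A hedged sketch, assuming a VIP in front of Galera (Ambari's setup wizard expects a single hostname; "mysql-vip" is a hypothetical name resolving to the cluster):

```properties
# Sketch only: ambari.properties pointing Ambari's database at a Galera VIP.
server.jdbc.url=jdbc:mysql://mysql-vip:3306/ambari
server.jdbc.hostname=mysql-vip
server.jdbc.database_name=ambari
```

Ranger's admin install similarly takes a single db_host in its install.properties, so a VIP keeps all three components' configurations uniform.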
Labels:
- Apache Ambari
- Apache Hive
- Apache Ranger
07-13-2017
08:39 PM
Hi,
I am trying to set up Kerberos on an HA-enabled cluster using the Ambari GUI.
The GUI keeps saying: "Invalid KDC administrator credentials. Please enter admin principal and password."
ambari-server.log shows the below error messages:
13 Jul 2017 19:43:25,469 ERROR [ambari-client-thread-34] KerberosHelperImpl:1861 - Cannot validate credentials: org.apache.ambari.server.serveraction.kerberos.KerberosAdminAuthenticationException: Invalid KDC administrator credentials. The KDC administrator credentials must be set as a persisted or temporary credential resource. This may be done by issuing a POST (or PUT for updating) to the /api/v1/clusters/:clusterName/credentials/kdc.admin.credential API entry point with the following payload: { "Credential" : { "principal" : "(PRINCIPAL)", "key" : "(PASSWORD)", "type" : "(persisted|temporary)"} }
13 Jul 2017 19:43:25,469 ERROR [ambari-client-thread-34] BaseManagementHandler:67 - Bad request received: Invalid KDC administrator credentials. The KDC administrator credentials must be set as a persisted or temporary credential resource. This may be done by issuing a POST (or PUT for updating) to the /api/v1/clusters/:clusterName/credentials/kdc.admin.credential API entry point with the following payload: { "Credential" : { "principal" : "(PRINCIPAL)", "key" : "(PASSWORD)", "type" : "(persisted|temporary)"} }
As per https://community.hortonworks.com/articles/42927/adding-kdc-administrator-credentials-to-the-ambari.html , I successfully completed the below steps:
1) ambari-server setup-security
2) curl -H "X-Requested-By:ambari" -u admin:admin -X POST -d '{ "Credential" : { "principal" : "kadmin", "key" : "kadmin", "type" : "persisted" } }' http://ambari01.dev.dataquest.com:8080/api/v1/clusters/dev_cluster/credentials/kdc.admin.credential
3) curl -H "X-Requested-By:ambari" -u admin:admin -X GET http://ambari01.dev.dataquest.com:8080/api/v1/clusters/dev_cluster/credentials/kdc.admin.credential
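Before POSTing, it can help to sanity-check the credential payload from step 2 separately. This is only a sketch: the principal and password values are placeholders, and with an AD-backed KDC the principal usually needs the full @REALM suffix:

```shell
# Build the kdc.admin.credential payload; PRINCIPAL/PASSWORD are placeholders.
PRINCIPAL="kadmin@DEV.DATAQUEST.COM"   # full principal, including the realm
PASSWORD="kadmin"
PAYLOAD=$(printf '{ "Credential" : { "principal" : "%s", "key" : "%s", "type" : "persisted" } }' "$PRINCIPAL" "$PASSWORD")
echo "$PAYLOAD"
# The POST itself (host/cluster names as in the original commands):
# curl -H "X-Requested-By:ambari" -u admin:admin -X POST -d "$PAYLOAD" \
#   http://ambari01.dev.dataquest.com:8080/api/v1/clusters/dev_cluster/credentials/kdc.admin.credential
```

Independently of Ambari, verifying the same credentials directly against the KDC (e.g. with kinit) can tell you whether the problem is the credential itself or how Ambari stores it.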
I am still having the problem.
Below are my inputs in the Ambari Kerberos GUI setup:
KDC HOST : kdc.dev.dataquest.com
Realm Name : DEV.DATAQUEST.COM
LDAP URL : ldaps://dev.dataquest.com:636
Container DN : OU=service-accounts,OU=core,dc=dev,dc=dataquest,dc=com
Domains: dev.dataquest.com,.dev.dataquest.com
Kadmin Host : kdc.dev.dataquest.com
Admin principal: kadmin
Admin password : kadmin
***********
I also tried the admin principal as kadmin@DEV.DATAQUEST.COM. Still no luck.
The ldapsearch command works fine.
Can you please suggest a resolution?
Thanks
Naveen
Labels:
- Apache Ambari
03-15-2017
04:12 AM
Santhosh, I see the error "User:amb_ranger_admin credentials on Ambari UI are not in sync with Ranger". Although I synced the passwords (to the same value), the changed/new passwords are not getting saved. Thanks, Naveen
03-12-2017
08:05 AM
Hi Eddie, I am not using a self-signed certificate, but a CA certificate. As per the link https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_security/content/ch_obtain-trusted-cert.html (4.1. Obtain a Certificate from a Trusted Third-Party Certification Authority (CA)), can you please clarify the situation below? If we have 6 service accounts (like hive, hbase, hdfs, oozie, ...) and, say, a 10-node cluster:
1) Do we need to generate 6 x 10 Java keystore files (.jks)?
2) And for each of these .jks files, do we need to create a respective CSR file? In this example there would be 60 .csr files.
3) Submit these 60 .csr files to the CA team to get 60 individual certificates with the .pem extension?
4) And do all these 60 .pem files need to be imported into the truststore on each of the 10 individual nodes?
Thanks, Naveen.
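As a rough check on the arithmetic: a common practice is one keystore per *host* rather than one per service per host, with the host's FQDN in the certificate CN, so a 10-node cluster typically needs 10 keystores/CSRs, not 60. A dry-run sketch (hypothetical hostnames) that only prints the keytool commands it would run, one per host:

```shell
# Dry run: print one keytool -genkey command per host (3 hypothetical hosts shown).
# In this pattern each host gets one keystore; services on that host share it.
for host in node1 node2 node3; do
  echo keytool -genkey -alias "$host" -keyalg RSA -keysize 2048 \
       -dname "CN=${host}.dev.dataquest.com" -keystore "${host}.jks"
done
```

Whether per-host or per-service keystores are required ultimately depends on your CA team's policy and which services terminate TLS.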
03-12-2017
07:21 AM
I am getting: "User:amb_ranger_admin credentials on Ambari UI are not in sync with Ranger". I followed the link https://community.hortonworks.com/questions/19948/this-alert-is-used-to-ensure-that-the-ranger-admin.html , still no luck. Ranger version: 0.6.0.2.5, HDP: 2.5.3.0. I am able to log in to the Ranger Admin UI as both amb_ranger_admin and admin, with their respective passwords. From Ambari, I tried to change the admin password to match amb_ranger_admin's password, but could not modify it. I also tried to modify amb_ranger_admin's password to match admin's; that could not be modified either. I bounced both Ranger and Ambari as well. Still no luck. Do we need to modify these passwords from MySQL? Or, after logging in to the Ranger Admin UI as both the amb_ranger_admin and admin users, can we update their passwords to the same value? Any help is appreciated. Thanks, Naveen.
Labels:
- Apache Ranger
03-11-2017
10:24 PM
I updated the admin user's password under Advanced ranger-env to the same password as amb_ranger_admin (both ranger_admin_password and admin_password are the same). Still no luck.
03-08-2017
06:40 PM
Thanks Eddie, it helped me a lot. I have a follow-up question. After we create the Java keystore file (.jks) using keytool -genkey ..., do we need to export it to a certificate (.cer) file so that it can be added to the truststore (/usr/lib/jvm/java-1.8.0-openjdk/jre/lib/security/cacerts) as a trusted certificate?
**********
I see the below error from beeline:
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadoop-node1.sandbox.com:10000/testdb;ssl=true;sslTrustStore=/usr/lib/jvm/java-1.8.0-openjdk/jre/lib/security/hivekeystore.jsk;trustStorePassword=changeit: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection? (state=08S01,code=0)
In the hiveserver2 log file I see the below error:
Caused by: org.apache.thrift.transport.TTransportException: Error validating the login
	at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
	at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
	at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
	... 4 more
2017-03-08 18:06:28,807 ERROR [HiveServer2-Handler-Pool: Thread-71]: transport.TSaslTransport (TSaslTransport.java:open(315)) - SASL negotiation failure
javax.security.sasl.SaslException: Error validating the login [Caused by javax.security.sasl.AuthenticationException: LDAP Authentication failed for user [Caused by javax.naming.AuthenticationException: [LDAP: error code 49 - 80090308: LdapErr: DSID-0C0903A9, comment: AcceptSecurityContext error, data 52e, v1db1^@]]]
	at org.apache.hive.service.auth.PlainSaslServer.evaluateResponse(PlainSaslServer.java:109)
	at org.apache.thrift.transport.TSaslTransport$SaslParticipant.evaluateChallengeOrResponse(TSaslTransport.java:539)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:283)
	at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
	at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: javax.security.sasl.AuthenticationException: LDAP Authentication failed for user [Caused by javax.naming.AuthenticationException: [LDAP: error code 49 - 80090308: LdapErr: DSID-0C0903A9, comment: AcceptSecurityContext error, data 52e, v1db1^@]]
	at org.apache.hive.service.auth.LdapAuthenticationProviderImpl.Authenticate(LdapAuthenticationProviderImpl.java:185)
	at org.apache.hive.service.auth.PlainSaslHelper$PlainServerCallbackHandler.handle(PlainSaslHelper.java:106)
	at org.apache.hive.service.auth.PlainSaslServer.evaluateResponse(PlainSaslServer.java:102)
Thanks, Naveen
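Two hints on the beeline error above: "Unrecognized SSL message, plaintext connection?" generally means the client attempted SSL but HiveServer2 answered in plaintext (SSL not actually enabled on the server), and sslTrustStore should point at a truststore (.jks) containing the server's certificate, not at the server's own keystore. A sketch of assembling the connect string (host and truststore path are hypothetical):

```shell
# Assemble a beeline SSL connect string; host and paths are hypothetical.
HOST="hadoop-node1.sandbox.com"
TRUSTSTORE="/etc/pki/java/hive-truststore.jks"   # truststore holding the server cert
URL="jdbc:hive2://${HOST}:10000/testdb;ssl=true;sslTrustStore=${TRUSTSTORE};trustStorePassword=changeit"
echo "$URL"
# beeline -u "$URL" -n <ldap-user> -p <ldap-password>
```

The "LDAP: error code 49 ... data 52e" in the HiveServer2 log is Active Directory's code for invalid credentials, so the SASL failure there is a separate username/password issue from the SSL one.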
03-03-2017
11:28 PM
1 Kudo
Hi, I am receiving the below error while creating the Hive repo using Ambari and running a test connection:
Connection Failed. Unable to retrieve any files using given parameters, You can still save the repository and start creating policies, but you would not be able to use autocomplete for resource names. Check ranger_admin.log for more info. org.apache.ranger.plugin.client.HadoopException: Unable to login to Hadoop environment [jaguar_cluster_hive]. Unable to login to Hadoop environment [cluster_hive]. Unable to decrypt password due to error. Input length must be multiple of 8 when decrypting with padded cipher.
********************
The xa_portal.log file shows the below error message:
- Unable to decrypt password due to error
javax.crypto.IllegalBlockSizeException: Input length must be multiple of 8 when decrypting with padded cipher
	at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:922)
********************
We are using OpenJDK. Cipher.getMaxAllowedKeyLength is: 2147483647. We are just using Ranger with LDAP/AD (no Kerberos).
Thanks for your help in advance, Naveen.
Labels:
- Apache Ranger
03-02-2017
08:05 PM
Hi, do we need to run both of the commands below for the Hive SSL configuration? We are using LDAP/AD.
1) As per the link https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_Security_Guide/content/ch_wire-hiveserver2.html :
keytool -genkey -alias hbase -keyalg RSA -keysize 1024 -keystore hbase.jks
and
2) As per the link https://docs.hortonworks.com/HDPDocuments/HDP2/HDP2.3.2/bk_dataintegration/content/enabling_hs2_for_ldap_and_ldapssl.html :
keytool -import -trustcacerts -alias <MyHiveLdaps> -storepass <password> -noprompt -file <myCert>.pem -keystore ${JAVA_HOME}/jre/lib/security/cacerts
I only did the 2nd one, and below are my 3 settings in custom-hive-site.xml:
hive.server2.use.SSL=true
hive.server2.keystore.path=/usr/lib/jvm/java-1.8.0-openjdk/jre/lib/security/cacerts
hive.server2.keystore.password=xxxxxxxx
Thanks, Naveen
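One thing that stands out in the settings above: hive.server2.keystore.path is expected to point at a keystore holding HiveServer2's own private key and certificate (the one created by the keytool -genkey step), not at the JRE cacerts truststore, which only holds trusted CA certificates. A hedged sketch of the custom-hive-site.xml, with a hypothetical keystore path:

```xml
<!-- Sketch only: custom-hive-site.xml pointing at a dedicated HiveServer2 keystore.
     /etc/security/serverKeys/hiveserver2.jks is a hypothetical path for a keystore
     created with keytool -genkey; it must contain HS2's private key, unlike cacerts. -->
<property>
  <name>hive.server2.use.SSL</name>
  <value>true</value>
</property>
<property>
  <name>hive.server2.keystore.path</name>
  <value>/etc/security/serverKeys/hiveserver2.jks</value>
</property>
<property>
  <name>hive.server2.keystore.password</name>
  <value>xxxxxxxx</value>
</property>
```

Under that reading, step 1 (genkey) creates the server keystore and step 2 (import into cacerts) makes clients trust it, so both steps serve different purposes.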
Labels:
- Apache Hive
- Apache Ranger