Member since
07-11-2017
6
Posts
0
Kudos Received
0
Solutions
08-26-2018
07:27 AM
Hi @Shivam Dwivedi, after your suggestion the above problem got resolved. I also added the Windows AD hostname (not the AWS internal private DNS name) to /etc/hosts on my cluster nodes. Thanks
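For anyone hitting the same certificate-hostname mismatch: the workaround above amounts to mapping the hostname in the AD server's certificate to its IP on every cluster node, so Ambari can be pointed at that hostname instead of the IP. A minimal sketch of the hosts entry (the hostname ad1.example.com and the IP below are placeholders, not the actual values from this setup):

```
# /etc/hosts on each cluster node (hostname and IP are illustrative)
10.0.0.10   ad1.example.com ad1
```

With this in place, the KDC host configured in Ambari should be the hostname, not the IP, so that the name checked during the TLS handshake can match a Subject Alternative Name in the AD certificate.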
08-25-2018
07:51 AM
Hello all, I have been trying to enable Kerberos with Active Directory in Ambari. I imported the AD CA certificate into the Ambari truststore and restarted the Ambari server, but the operation failed with the following error.
Error 1: Failed to communicate with the Active Directory at ldaps://<IP_ADDR>:636: <IP_ADDR>:636 javax.naming.CommunicationException: <IP_ADDR>:636 [Root exception is javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching ip-172-31-43-242.us-west-2.compute.internal found.]
at com.sun.jndi.ldap.Connection.<init>(Connection.java:238)
at com.sun.jndi.ldap.LdapClient.<init>(LdapClient.java:137)
at com.sun.jndi.ldap.LdapClient.getInstance(LdapClient.java:1615)
at com.sun.jndi.ldap.LdapCtx.connect(LdapCtx.java:2749)
at com.sun.jndi.ldap.LdapCtx.<init>(LdapCtx.java:319)
at com.sun.jndi.ldap.LdapCtxFactory.getUsingURL(LdapCtxFactory.java:192)
at com.sun.jndi.ldap.LdapCtxFactory.getUsingURLs(LdapCtxFactory.java:210)
at com.sun.jndi.ldap.LdapCtxFactory.getLdapCtxInstance(LdapCtxFactory.java:153)
at com.sun.jndi.ldap.LdapCtxFactory.getInitialContext(LdapCtxFactory.java:83)
at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:684)
at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:313)
at javax.naming.InitialContext.init(InitialContext.java:244)
at javax.naming.ldap.InitialLdapContext.<init>(InitialLdapContext.java:154)
at org.apache.ambari.server.serveraction.kerberos.ADKerberosOperationHandler.createInitialLdapContext(ADKerberosOperationHandler.java:514)
at org.apache.ambari.server.serveraction.kerberos.ADKerberosOperationHandler.createLdapContext(ADKerberosOperationHandler.java:465)
at org.apache.ambari.server.serveraction.kerberos.ADKerberosOperationHandler.open(ADKerberosOperationHandler.java:182)
at org.apache.ambari.server.controller.KerberosHelperImpl.validateKDCCredentials(KerberosHelperImpl.java:1901)
at org.apache.ambari.server.controller.KerberosHelperImpl.handleTestIdentity(KerberosHelperImpl.java:2230)
at org.apache.ambari.server.controller.KerberosHelperImpl.createTestIdentity(KerberosHelperImpl.java:1029)
at org.apache.ambari.server.controller.AmbariManagementControllerImpl.createAction(AmbariManagementControllerImpl.java:4249)
at org.apache.ambari.server.controller.internal.RequestResourceProvider$1.invoke(RequestResourceProvider.java:264)
at org.apache.ambari.server.controller.internal.RequestResourceProvider$1.invoke(RequestResourceProvider.java:193)
at org.apache.ambari.server.controller.internal.AbstractResourceProvider.invokeWithRetry(AbstractResourceP...
Caused by: javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching <IP_ADDR> found.
at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1964)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:328)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:322)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1614)
at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1052)
at sun.security.ssl.Handshaker.process_record(Handshaker.java:987)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1072)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1385)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
at com.sun.jndi.ldap.Connection.createSocket(Connection.java:394)
at com.sun.jndi.ldap.Connection.<init>(Connection.java:215)
... 116 more
Caused by: java.security.cert.CertificateException: No subject alternative DNS name matching ip-172-31-43-242.us-west-2.compute.internal found.
at sun.security.util.HostnameChecker.matchDNS(HostnameChecker.java:214)
at sun.security.util.HostnameChecker.match(HostnameChecker.java:96)
at sun.security.ssl.X509TrustManagerImpl.checkIdentity(X509TrustManagerImpl.java:459)
at sun.security.ssl.X509TrustManagerImpl.checkIdentity(X509TrustManagerImpl.java:436)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:200)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1596)
ERROR [ambari-client-thread-42] KerberosHelperImpl:2232 - Cannot validate credentials: org.apache.ambari.server.serveraction.kerberos.KerberosInvalidConfigurationException: Failed to connect to KDC - Failed to communicate with the Active Directory at ldaps://<IP_ADDR>:636: <IP_ADDR>:636
Make sure the server's SSL certificate or CA certificates have been imported into Ambari's truststore.
25 Aug 2018 07:40:41,298 ERROR [ambari-client-thread-42] BaseManagementHandler:67 - Bad request received: Failed to connect to KDC - Failed to communicate with the Active Directory at ldaps://<IP_ADDR>:636: <IP_ADDR>:636
I also tried to test the connection from the Ambari host to AD using the following:
LDAPTLS_CACERT=ad1.cer ldapsearch -H ldaps://<IP_ADDR>:636 -D "hdfadmin@example.com" -b "OU=hdfedge,DC=Hadoop,DC=Internal" "(&(objectclass=person)(sAMAccountName=*))"
but it gives the error below:
ldap_sasl_bind(SIMPLE): Can't contact LDAP server (-1)
If anyone has faced the same problem, please point me in the right direction.
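In case it helps others debugging this: the SSLHandshakeException says the certificate's Subject Alternative Names do not include the name being connected to, so connecting by IP (or by the EC2-internal DNS name) will always fail the hostname check. A hedged sketch of how one could inspect which names the AD certificate actually carries and retry the bind by hostname (ad1.example.com is a placeholder for the real AD hostname):

```
# Inspect the SAN entries in the AD server's certificate (placeholder host)
openssl s_client -connect ad1.example.com:636 </dev/null 2>/dev/null \
  | openssl x509 -noout -text | grep -A1 'Subject Alternative Name'

# Retry the search using a hostname that appears in that SAN list
LDAPTLS_CACERT=ad1.cer ldapsearch -H ldaps://ad1.example.com:636 \
  -D "hdfadmin@example.com" -W \
  -b "OU=hdfedge,DC=Hadoop,DC=Internal" \
  "(&(objectclass=person)(sAMAccountName=*))"
```

If the hostname does not resolve from the Ambari host, an /etc/hosts entry (as mentioned in the later reply in this thread's resolution) is one way to make it resolvable.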
10-16-2017
01:29 AM
We are not able to connect to SAP systems after enabling Kerberos. I really don't understand how Kerberos and SSL work in a Hadoop cluster. If anyone has faced the same challenge, please let me know the resolution. After enabling Kerberos, please help me understand how SSL and Kerberos work, and how to connect to other systems.
07-03-2017
11:41 AM
<---------------job.properties------------------------->
nameNode=hdfs://sample:8020
jobTracker=http://sample:8050 classpath
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib/lib_20170421164446
oozie.action.sharelib.for.hive=hive,hcatalog,sqoop,pig
###### Workflow ##################
oozie.wf.application.path=${nameNode}/user/${user.name}/apps/oozie/oozie_job_sample/workflow/wf_sample.xml
####### Oozie Job Config: ################
property_file=${nameNode}/user/${user.name}/apps/oozie/oozie_job_sample/property_files
hive_site_xml=${nameNode}/user/${user.name}/apps/oozie/oozie_job_sample/workflow/conf
## Hcat-hive action credentials
hcat_metastore_uri=thrift://server:9083
hcat_metastore_principal=hive/server@host.LAN
<--------------------workflow---------------------->
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="wf_cca_sample">
  <global>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <job-xml>${hive_site_xml}/hive-site.xml</job-xml>
    <configuration>
      <property>
        <name>oozie.hive.defaults</name>
        <value>${hive_site_xml}/hive-site.xml</value>
      </property>
      <property>
        <name>hive.execution.engine</name>
        <value>tez</value>
      </property>
    </configuration>
  </global>
  <credentials>
    <credential name="hcat-creds" type="hcat">
      <property>
        <name>hcat.metastore.uri</name>
        <value>${hcat_metastore_uri}</value>
      </property>
      <property>
        <name>hcat.metastore.principal</name>
        <value>${hcat_metastore_principal}</value>
      </property>
    </credential>
  </credentials>
  <start to="test_sample"/>
  <action name="test_sample" cred="hcat-creds">
    <hive xmlns="uri:oozie:hive2-action:0.4">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <job-xml>${hive_site_xml}/hive-site.xml</job-xml>
      <script>test_sample.sql</script>
    </hive>
    <ok to="end"/>
    <error to="fail_email"/>
  </action>
  <kill name="fail_email">
    <message>Daily Oozie workflow job for "sample_job" is failed , error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
<---------------------error------------------------------------>
oozi-W] ACTION[0000004-170703113843812-oozie-oozi-W@test_sample] Launcher exception: java.lang.NullPointerException
java.lang.RuntimeException: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:560)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:336)
at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:313)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:58)
at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:239)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.NullPointerException
at org.apache.hadoop.crypto.key.KeyProviderExtension.<init>(KeyProviderExtension.java:43)
at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.<init>(KeyProviderDelegationTokenExtension.java:97)
at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.createKeyProviderDelegationTokenExtension(KeyProviderDelegationTokenExtension.java:134)
at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2402)
at org.apache.tez.common.security.TokenCache.obtainTokensForFileSystemsInternal(TokenCache.java:119)
at org.apache.tez.common.security.TokenCache.obtainTokensForFileSystemsInternal(TokenCache.java:98)
at org.apache.tez.common.security.TokenCache.obtainTokensForFileSystems(TokenCache.java:76)
at org.apache.tez.client.TezClientUtils.setupTezJarsLocalResources(TezClientUtils.java:198)
at org.apache.tez.client.TezClient.getTezJarResources(TezClient.java:831)
at org.apache.tez.client.TezClient.start(TezClient.java:355)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:197)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:116)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:557)
... 19 more
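A note for anyone who finds this via the stack trace: the NPE is raised while Tez obtains HDFS delegation tokens and tries to build a KeyProviderDelegationTokenExtension, which points at the KMS key-provider configuration rather than at the workflow itself. One possible cause (an assumption on my part, not confirmed in this thread) is that the hive-site.xml bundled with the workflow does not carry the cluster's key-provider setting while the cluster uses HDFS transparent encryption. A hedged sketch of the property that would then need to be present in the job's effective configuration (the KMS host and port below are placeholders):

```
<!-- Assumption: the cluster uses HDFS transparent encryption backed by a KMS.
     If so, the hive-site.xml shipped in ${hive_site_xml} may need the same
     key-provider URI that the cluster's hdfs-site.xml defines.
     The KMS endpoint below is a placeholder. -->
<property>
  <name>dfs.encryption.key.provider.uri</name>
  <value>kms://http@kms-host:9292/kms</value>
</property>
```

Comparing the bundled hive-site.xml against the live cluster configuration for this property would confirm or rule this out.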
12-05-2016
06:40 AM
Yes, it is partitioned. I tried your suggestion to repair the table with msck repair table <table_name>; but after doing that, the number of records is still more than the actual record count. Can you please help me here? Thanks in advance.
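For reference, a repair-and-verify sequence one could run to see what MSCK actually added (the database and table names below are placeholders, not the poster's real table):

```
# Re-sync the metastore with the partition directories on HDFS,
# then list partitions and count rows to compare against expectations
hive -e "MSCK REPAIR TABLE sample_db.sample_table;"
hive -e "SHOW PARTITIONS sample_db.sample_table;"
hive -e "SELECT COUNT(*) FROM sample_db.sample_table;"
```

If the count is higher than expected after the repair, SHOW PARTITIONS can reveal whether stray or duplicate directories under the table path were registered as extra partitions, which is one common way MSCK inflates a row count.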