Member since: 12-20-2015 | Posts: 39 | Kudos Received: 0 | Solutions: 0
07-29-2019
11:07 AM
@Geoffrey Shelton Okot, help please.
07-29-2019
11:06 AM
Hi All, I am trying to run a Spark job that creates a Hive context and is triggered through Oozie using a shell action. When I run this job standalone without Oozie, it works fine. With Oozie it gives the error below. Any help please:

19/07/29 10:56:07 INFO hive.HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
19/07/29 10:56:07 INFO client.ClientWrapper: Inspected Hadoop version: 2.7.3.2.6.4.0-91
19/07/29 10:56:07 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.3.2.6.4.0-91
19/07/29 10:56:07 INFO client.ClientWrapper: Attempting to login to Kerberos using principal: user@HDP.COM and keytab: user.keytab-f5b06c50-7f5f-4ba4-985c-74c4c2c98f74
19/07/29 10:56:07 INFO security.UserGroupInformation: Login successful for user user@HDP.COM using keytab file user.keytab-f5b06c50-7f5f-4ba4-985c-74c4c2c98f74
19/07/29 10:56:07 INFO hive.metastore: Trying to connect to metastore with URI thrift://<masternode>:9083
19/07/29 10:56:07 ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
	at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
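This failure pattern (the keytab login succeeds, yet the metastore's SASL handshake finds no Kerberos TGT) usually means the Oozie launcher container has no usable ticket cache when the HiveContext connects. A minimal sketch of one mitigation, under assumptions: the keytab is shipped to the container via a <file> tag in workflow.xml, the principal matches the log above, and your_job.py is a hypothetical name.

```shell
# Hedged sketch of a wrapper script for the Oozie <shell> action.
# PRINCIPAL/KEYTAB are taken from the log above; your_job.py is a
# placeholder for the actual job.
PRINCIPAL="user@HDP.COM"
KEYTAB="user.keytab"

CMD=(spark-submit --master yarn --deploy-mode cluster
     --principal "$PRINCIPAL" --keytab "$KEYTAB"
     your_job.py)

if [ "${DRY_RUN:-1}" = "1" ]; then
  # Print the command instead of running it (no cluster required here).
  printf '%s ' "${CMD[@]}"; echo
else
  # Acquire a TGT inside the container first; without credentials the
  # metastore SASL/GSSAPI handshake fails exactly as in the trace above.
  kinit -kt "$KEYTAB" "$PRINCIPAL"
  "${CMD[@]}"
fi
```

With --principal/--keytab, Spark can also re-login on its own for long-running jobs; whether that alone fixes the Oozie case depends on how the action ships the keytab.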
Labels: Apache Hive, Apache Oozie, Apache Spark
11-12-2018
06:40 AM
@Jay Kumar SenSharma
I am also facing the same issue. However, in my case I see that all packages are installed and yum.log is clean (no errors).
ambari=> select * from host_version;
 id | repo_version_id | host_id | state
----+-----------------+---------+----------------
  8 |               2 |       1 | CURRENT
  9 |               2 |       5 | CURRENT
 13 |               2 |       3 | CURRENT
 12 |               2 |       2 | CURRENT
 14 |               2 |       4 | CURRENT
 11 |               2 |       7 | CURRENT
 10 |               2 |       6 | CURRENT
 62 |              52 |       2 | INSTALL_FAILED
 63 |              52 |       3 | INSTALL_FAILED
 58 |              52 |       1 | INSTALL_FAILED
 64 |              52 |       4 | INSTALL_FAILED
 59 |              52 |       5 | INSTALL_FAILED
 61 |              52 |       7 | INSTALL_FAILED
 60 |              52 |       6 | INSTALL_FAILED
(14 rows)
The new target version is showing as failed, although the packages are installed on all nodes, and I cannot get to the upgrade prompt.
10-24-2018
06:10 AM
Why is it using LDAP? LDAP is not set up on my cluster; I am using a KDC.
@JayKumarSharma
Also, I have done the configuration in the admin topology, so I am now using admin instead of default in my URL.
[hdfs@<knox1> ~]$ curl -k -i -vvvv -u guest:guest-password "https://<knox>:8443/gateway/default/webhdfs/v1/user?=op=LISTSTATUS"
* About to connect() to <knox> port 8443 (#0)
* Trying <knoxIP>... connected
* Connected to <knox> (<knoxIP>) port 8443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* warning: ignoring value of ssl.verifyhost
* skipping SSL peer certificate verification
* SSL connection using TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
* Server certificate:
* subject: CN=<knox>,OU=Test,O=Hadoop,L=Test,ST=Test,C=US
* start date: Oct 22 16:16:52 2018 GMT
* expire date: Oct 22 16:16:52 2019 GMT
* common name: <knox>
* issuer: CN=<knox>,OU=Test,O=Hadoop,L=Test,ST=Test,C=US
* Server auth using Basic with user 'guest'
> GET /gateway/default/webhdfs/v1/user?=op=LISTSTATUS HTTP/1.1
> Authorization: Basic Z3Vlc3Q6Z3Vlc3QtcGFzc3dvcmQ=
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.27.1 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: <knox>:8443
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
HTTP/1.1 401 Unauthorized
< Date: Wed, 24 Oct 2018 06:04:23 GMT
Date: Wed, 24 Oct 2018 06:04:23 GMT
< Set-Cookie: rememberMe=deleteMe; Path=/gateway/default; Max-Age=0; Expires=Tue, 23-Oct-2018 06:04:23 GMT
Set-Cookie: rememberMe=deleteMe; Path=/gateway/default; Max-Age=0; Expires=Tue, 23-Oct-2018 06:04:23 GMT
* Authentication problem. Ignoring this.
< WWW-Authenticate: BASIC realm="application"
WWW-Authenticate: BASIC realm="application"
< Content-Length: 0
Content-Length: 0
< Server: Jetty(9.2.15.v20160210)
Server: Jetty(9.2.15.v20160210)
<
* Connection #0 to host <knox> left intact
* Closing connection #0
[hdfs@<knox1> ~]$
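Two things may be worth checking before digging deeper. First, the query string above is `?=op=LISTSTATUS` (note the stray `=` before `op`); WebHDFS expects `?op=LISTSTATUS`. Second, the 401 comes from the topology's authentication provider: the default topology in HDP ships with a ShiroProvider wired to the Knox demo LDAP, which is why LDAP appears even though the cluster itself uses a KDC. A small sketch (hostname is a placeholder) that decodes the Basic header curl actually sent and shows a corrected URL:

```shell
# Decode the Authorization header from the trace above; if this does not
# match what the topology's authentication provider expects, Knox returns
# 401 before the request ever reaches WebHDFS.
SENT_HEADER="Z3Vlc3Q6Z3Vlc3QtcGFzc3dvcmQ="
DECODED=$(echo "$SENT_HEADER" | base64 -d)
echo "$DECODED"

# Corrected request against the admin topology (placeholder hostname),
# with a well-formed op parameter:
URL="https://<knox>:8443/gateway/admin/webhdfs/v1/user?op=LISTSTATUS"
echo "curl -ik -u guest:guest-password \"$URL\""
```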
10-23-2018
06:45 PM
Hi, I have Kerberos and HA enabled on my Hadoop cluster. To enable HA over WebHDFS I did the following configuration:

<provider>
    <role>ha</role>
    <name>HaProvider</name>
    <enabled>true</enabled>
    <param>
        <name>WEBHDFS</name>
        <value>maxFailoverAttempts=3;failoverSleep=1000;maxRetryAttempts=300;retrySleep=1000;enabled=true</value>
    </param>
</provider>

<service>
    <role>WEBHDFS</role>
    <url>http://<nn1>:50070/webhdfs</url>
    <url>http://<nn2>:50070/webhdfs</url>
</service>

But the curl command is still failing. I am not using SSL. Can someone point me to the correct curl command I should use, assuming knox1 is the hostname of my Knox gateway? I have used the command below:

$ curl -k -i -vvvv --negotiate -u : "http://<knox1>:50070/gateway/<cluster_name>/webhdfs/v1/user?=op=LISTSTATUS"

I have followed the tutorials below, but they did not help:
https://community.hortonworks.com/questions/35125/knox-error-after-configuring-namenode-ha.html
https://community.hortonworks.com/content/supportkb/150585/how-to-configure-a-knox-topology-for-namenode-ha-1.html
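For reference, Knox requests go to the Knox gateway itself, not to the NameNode: the gateway listens on its own port (8443 by default) and, in a stock install, over HTTPS. The command above targets port 50070 with a /gateway/ path, which the NameNode will not serve, and the op parameter is written `?=op=...` instead of `?op=...`. A hedged sketch of the shape the request should take (host and topology name are placeholders; the auth flags depend on the topology's authentication provider):

```shell
KNOX_HOST="knox1"           # placeholder: the Knox gateway host
TOPOLOGY="cluster_name"     # placeholder: the topology file name

# Knox listens on its gateway port (8443 by default), not on the
# NameNode's 50070; the HaProvider then fails over between nn1/nn2.
URL="https://${KNOX_HOST}:8443/gateway/${TOPOLOGY}/webhdfs/v1/user?op=LISTSTATUS"
echo "curl -ik --negotiate -u : \"$URL\""
```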
Labels: Apache Hadoop, Apache Knox
08-11-2018
08:20 PM
I fixed the issue on RHEL 6.9 by installing libtirpc and libtirpc-devel 0.15 and uninstalling libtirpc 0.13.
07-06-2018
06:30 AM
Hi, I have to upgrade my Dev, QA, Preprod, and Prod HDP clusters from 2.5.3.16 to 2.6.4 one by one. All the clusters are Kerberos-enabled and use the full stack. Can you please share a link I can read for the details of the upgrade process? Any link where issues faced during upgrades are discussed? TIA
Labels: Hortonworks Data Platform (HDP)
06-26-2018
01:29 PM
@Jay Kumar SenSharma My client machine (my Mac) is in my company domain, so I already have a valid klist ticket from my company domain. How can I now do a curl from my Mac to the HDP cluster?
06-26-2018
01:17 PM
Hi, I have a Kerberized cluster. I want to run WebHDFS/REST calls from my laptop. I do not have Knox as of now. How can I do that?
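Without Knox, a laptop can talk to WebHDFS directly over SPNEGO, provided it can reach the NameNode and has Kerberos client configuration for the cluster realm. A sketch under those assumptions (hostname and principal are placeholders):

```shell
NN="namenode.example.com"   # placeholder: active NameNode host
USER_PRINC="me@HDP.COM"     # placeholder: a principal in the cluster realm

# 1. /etc/krb5.conf on the laptop must know the cluster's realm and KDC.
# 2. Get a TGT from the cluster's KDC (a corporate-domain ticket will not
#    work unless cross-realm trust is configured):
#      kinit "$USER_PRINC"
# 3. Call WebHDFS with SPNEGO; '-u :' tells curl to take the identity
#    from the Kerberos ticket cache:
URL="http://${NN}:50070/webhdfs/v1/user?op=LISTSTATUS"
echo "curl --negotiate -u : \"$URL\""
```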
Labels: Apache Hadoop
06-25-2018
12:58 PM
@Geoffrey Shelton Okot getting the error below:

Caused by: javax.naming.PartialResultException [Root exception is javax.naming.CommunicationException: simple bind failed: global.publicisgroupe.net:636 [Root exception is javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target]]
	at com.sun.jndi.ldap.AbstractLdapNamingEnumeration.hasMoreImpl(AbstractLdapNamingEnumeration.java:237)
	at com.sun.jndi.ldap.AbstractLdapNamingEnumeration.hasMore(AbstractLdapNamingEnumeration.java:189)
	at org.apache.ambari.server.serveraction.kerberos.ADKerberosOperationHandler.findPrincipalDN(ADKerberosOperationHandler.java:601)
	at org.apache.ambari.server.serveraction.kerberos.ADKerberosOperationHandler.principalExists(ADKerberosOperationHandler.java:233)
	... 9 more
06-23-2018
03:46 PM
@Geoffrey Shelton Okot Can my KDC server and AD be the same? I do not find the HDP documentation straightforward, with clear instructions for enabling Kerberos with AD.
06-23-2018
03:38 PM
Hi, I have to turn on HDP security using AD for the first time on my cluster. Can someone provide me step-by-step documentation and the prerequisites I need handy? Can my AD be my KDC, or do I ideally need a KDC installed within the Hadoop cluster with the same realm? While enabling Kerberos using an existing AD, what will be the kadmin host and the KDC host? TIA @Geoffrey Shelton Okot
06-23-2018
03:31 PM
@Geoffrey Shelton Okot: Now I need to access my HDP cluster from my laptop using curl/REST API, but I am not able to do so. My laptop is in a different AD domain. I tried enabling SPNEGO/HTTP as well, but no luck. The curl call works inside the cluster but not from outside. Any documentation help on that?
06-19-2018
05:12 AM
Hi, how can we set up SPNEGO authentication for a client when the client and the HDP cluster are in different domains? My HDP cluster is running with the realm ENV.COM, and I have installed a KDC server. My Mac is on the corporate domain global.company.com. When I do a klist on the Mac, it gives me a ticket from my corporate domain controller. I cannot use a keytab from the cluster. Can someone help?
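One approach that avoids cross-realm trust entirely: add the cluster realm to the client's Kerberos configuration and kinit explicitly as a cluster principal (the corporate ticket stays untouched). A sketch of the additions to /etc/krb5.conf on the Mac, with placeholder KDC hostnames; the realm ENV.COM is taken from the post:

```shell
# Print the hypothetical krb5.conf additions; the kdc host and the
# .env.example.com domain mapping are placeholders, not real values.
KRB5_SNIPPET=$(cat <<'EOF'
[realms]
  ENV.COM = {
    kdc = kdc.env.example.com
    admin_server = kdc.env.example.com
  }

[domain_realm]
  .env.example.com = ENV.COM
EOF
)
echo "$KRB5_SNIPPET"
# After this, 'kinit user@ENV.COM' on the laptop requests a TGT from the
# cluster's KDC rather than the corporate domain controller.
```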
06-17-2018
10:48 PM
Hi, I have a Kerberized cluster setup working fine. I want my end users to do all their WebHDFS, Atlas, and other work using the REST API, curl, or Postman from their laptops. Can someone point me to steps/documentation to achieve this, along with a few examples? Thanks much.
Labels: Apache Atlas, Apache Hadoop
06-16-2018
11:14 PM
Why install a broker? Just yum install kafka, and the rest of the configuration is managed via the command line/custom conf files. You do not need any configuration for the client.
06-14-2018
06:19 PM
I have Kerberized the HDP cluster. All the components work fine except Kafka: I observed that I am able to run Kafka without any token or kinit, while for other components like HDFS, Hive, HBase, and Spark I have to do kinit. Why is that? Any reasons? I only have one Kafka broker in my cluster.
Labels: Apache Kafka
05-25-2018
01:36 PM
Seeing the exception below in the Kafka logs after enabling Kerberos:

[2018-05-25 18:57:02,370] INFO Connecting to zookeeper on <host1>:2181,<host2>:2181,<host3>:2181 (kafka.server.KafkaServer)
[2018-05-25 18:57:02,596] FATAL Fatal error during KafkaServer startup. Prepare to shutdown (kafka.server.KafkaServer)
org.I0Itec.zkclient.exception.ZkAuthFailedException: Authentication failure
	at org.I0Itec.zkclient.ZkClient.waitForKeeperState(ZkClient.java:946)
	at org.I0Itec.zkclient.ZkClient.waitUntilConnected(ZkClient.java:923)
	at org.I0Itec.zkclient.ZkClient.connect(ZkClient.java:1230)
	at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:156)
	at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:130)
	at kafka.utils.ZkUtils$.createZkClientAndConnection(ZkUtils.scala:75)
	at kafka.utils.ZkUtils$.apply(ZkUtils.scala:57)
	at kafka.server.KafkaServer.initZk(KafkaServer.scala:294)
	at kafka.server.KafkaServer.startup(KafkaServer.scala:180)
	at kafka.server.KafkaServerStartable.startup(KafkaServerStartable.scala:37)
	at kafka.Kafka$.main(Kafka.scala:67)
	at kafka.Kafka.main(Kafka.scala)
[2018-05-25 18:57:02,607] INFO shutting down (kafka.server.KafkaServer)
[2018-05-25 18:57:02,618] INFO shut down completed (kafka.server.KafkaServer)
[2018-05-25 18:57:02,619] FATAL Fatal error during KafkaServerStartable startup. Prepare to shutdown (kafka.server.KafkaServerStartable)
org.I0Itec.zkclient.exception.ZkAuthFailedException: Authentication failure
	at org.I0Itec.zkclient.ZkClient.waitForKeeperState(ZkClient.java:946)
	at org.I0Itec.zkclient.ZkClient.waitUntilConnected(ZkClient.java:923)
	at org.I0Itec.zkclient.ZkClient.connect(ZkClient.java:1230)
	at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:156)
	at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:130)
	at kafka.utils.ZkUtils$.createZkClientAndConnection(ZkUtils.scala:75)
	at kafka.utils.ZkUtils$.apply(ZkUtils.scala:57)
	at kafka.server.KafkaServer.initZk(KafkaServer.scala:294)
	at kafka.server.KafkaServer.startup(KafkaServer.scala:180)
	at kafka.server.KafkaServerStartable.startup(KafkaServerStartable.scala:37)
	at kafka.Kafka$.main(Kafka.scala:67)
	at kafka.Kafka.main(Kafka.scala)
[2018-05-25 18:57:02,634] INFO shutting down (kafka.server.KafkaServer)
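A ZkAuthFailedException at this point usually means the broker's JVM could not authenticate to ZooKeeper at all. One thing worth verifying is the JAAS file passed to the broker via KAFKA_OPTS (typically `-Djava.security.auth.login.config=/etc/kafka/conf/kafka_jaas.conf`): the ZooKeeper client uses the Client section. A sketch of that section — the keytab path and principal below are placeholders, not values from this cluster:

```
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/kafka.service.keytab"
  principal="kafka/broker-host@EXAMPLE.COM";
};
```

If the section is present, a stale or mismatched keytab and clock skew between the broker and the KDC are the next usual suspects.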
Labels: Apache Kafka
05-20-2018
02:12 AM
Any step-by-step link/documentation for enabling SSL on HDP across all components: Kafka, Spark, Hive, HBase, HDFS, MR, YARN, etc.?
05-10-2018
12:45 PM
I have set up my Kerberized dev cluster. I now need to add 5 users to the cluster who can access HDFS, Hive, and HBase and submit Spark jobs. I have MIT KDC installed on one of the nodes. Any link or documentation will be helpful. TIA.
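For an MIT KDC, provisioning boils down to creating a principal per user plus an HDFS home directory; finer-grained Hive/HBase access is typically layered on with Ranger policies. A hedged sketch (usernames are hypothetical; the provisioning commands are shown as comments rather than executed):

```shell
# Hypothetical users; run the commented commands on the KDC host / an
# HDFS client node as appropriate.
USERS=(alice bob carol dave erin)

for u in "${USERS[@]}"; do
  # Create the Kerberos principal (prompts for a password):
  #   kadmin.local -q "addprinc ${u}@HDP.COM"
  # Create and own the HDFS home directory as the hdfs superuser:
  #   sudo -u hdfs hdfs dfs -mkdir -p "/user/${u}"
  #   sudo -u hdfs hdfs dfs -chown "${u}:hdfs" "/user/${u}"
  echo "provision ${u}"
done
```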
05-10-2018
12:40 PM
Thanks @Geoffrey Shelton Okot. This is done. How can I validate Hive and HBase as well?
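Two quick smoke tests, sketched under assumptions (hostname and realm below are placeholders; run after kinit as a test user):

```shell
# HiveServer2 JDBC URL with the Hive service principal appended -- the
# host and realm here are placeholders for this sketch.
HIVE_JDBC="jdbc:hive2://hs2.example.com:10000/default;principal=hive/_HOST@HDP.COM"

# Hive: run a trivial query through HiveServer2:
#   beeline -u "$HIVE_JDBC" -e "show databases;"
# HBase: the shell picks up the Kerberos ticket cache automatically:
#   echo "status" | hbase shell
echo "$HIVE_JDBC"
```

If kinit is skipped first, both commands should fail with a GSS/SASL error, which is itself a useful negative test.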
05-10-2018
11:47 AM
@Geoffrey Shelton @Sandeep Nemuri Guys, thanks a lot. I am done successfully. Can you share a few steps to verify services like HDFS, Spark, YARN, Hive, and HBase?
05-10-2018
09:54 AM
@Geoffrey Shelton Okot @Sandeep Nemuri I do not see the Kerberos wizard in my Ambari. What is the issue? I have followed the steps provided by @Geoffrey Shelton Okot up to installing the JCE files and restarting the Ambari server.
05-10-2018
08:57 AM
@Geoffrey Shelton Okot Thanks! So we need to have MIT KDC or AD running, right? For the development environment, is there a way to set up an MIT KDC specifically for development? Any link for that, please?
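Yes: for a dev cluster, a throwaway MIT KDC on one node is enough. The rough sequence on RHEL/CentOS, sketched with a placeholder realm (the numbered commands are shown as comments, not executed here):

```shell
REALM="DEV.EXAMPLE.COM"   # placeholder realm for a dev-only KDC

# 1. yum -y install krb5-server krb5-libs krb5-workstation
# 2. Set the realm and KDC host in /etc/krb5.conf and
#    /var/kerberos/krb5kdc/kdc.conf
# 3. kdb5_util create -s -r "$REALM"            # create the KDC database
# 4. kadmin.local -q "addprinc admin/admin"     # admin principal for Ambari
# 5. service krb5kdc start && service kadmin start   # (systemctl on EL7)
echo "$REALM"
```

Ambari's Kerberos wizard can then point at this KDC, with admin/admin as the administrative credential.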
05-06-2018
04:40 PM
Hi, I have HDP installed on my cluster. The next task is to enable Kerberos for the HDP cluster. Can someone point me to step-by-step documentation, please? Do I need AD/LDAP as well? It's a development cluster.
04-12-2018
11:29 AM
Thanks Prashant!