Member since: 12-20-2015
Posts: 39
Kudos Received: 0
Solutions: 0
11-12-2018
06:40 AM
@Jay Kumar SenSharma
I am also facing the same issue. However, in my case all packages are installed and yum.log is clean, meaning there are no errors.
ambari=> select * from host_version;
id | repo_version_id | host_id | state
----+-----------------+---------+----------------
8 | 2 | 1 | CURRENT
9 | 2 | 5 | CURRENT
13 | 2 | 3 | CURRENT
12 | 2 | 2 | CURRENT
14 | 2 | 4 | CURRENT
11 | 2 | 7 | CURRENT
10 | 2 | 6 | CURRENT
62 | 52 | 2 | INSTALL_FAILED
63 | 52 | 3 | INSTALL_FAILED
58 | 52 | 1 | INSTALL_FAILED
64 | 52 | 4 | INSTALL_FAILED
59 | 52 | 5 | INSTALL_FAILED
61 | 52 | 7 | INSTALL_FAILED
60 | 52 | 6 | INSTALL_FAILED
(14 rows)
The new target version is showing as failed even though its packages are installed on all nodes, and I cannot get to the upgrade prompt.
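For reference, this is roughly how I cross-checked that the packages really are present on every host (the hostnames below are placeholders, and it assumes an HDP stack where hdp-select is available):

# list the installed stack versions and the last few hadoop packages on each host
for h in host1 host2 host3; do
  echo "== $h =="
  ssh "$h" 'hdp-select versions; yum -q list installed "hadoop_*" | tail -n 5'
done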
10-24-2018
06:10 AM
Why is it using LDAP? LDAP is not set up on my cluster; I am using KDC.
@JayKumarSharma
Also, I have done the configuration in the admin topology, so I am now using admin instead of default in my URL.
[hdfs@<knox1> ~]$ curl -k -i -vvvv -u guest:guest-password "https://<knox>:8443/gateway/default/webhdfs/v1/user?=op=LISTSTATUS"
* About to connect() to <knox> port 8443 (#0)
* Trying <knoxIP>... connected
* Connected to <knox> (<knoxIP>) port 8443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* warning: ignoring value of ssl.verifyhost
* skipping SSL peer certificate verification
* SSL connection using TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
* Server certificate:
* subject: CN=<knox>,OU=Test,O=Hadoop,L=Test,ST=Test,C=US
* start date: Oct 22 16:16:52 2018 GMT
* expire date: Oct 22 16:16:52 2019 GMT
* common name: <knox>
* issuer: CN=<knox>,OU=Test,O=Hadoop,L=Test,ST=Test,C=US
* Server auth using Basic with user 'guest'
> GET /gateway/default/webhdfs/v1/user?=op=LISTSTATUS HTTP/1.1
> Authorization: Basic Z3Vlc3Q6Z3Vlc3QtcGFzc3dvcmQ=
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.27.1 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: <knox>:8443
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
HTTP/1.1 401 Unauthorized
< Date: Wed, 24 Oct 2018 06:04:23 GMT
Date: Wed, 24 Oct 2018 06:04:23 GMT
< Set-Cookie: rememberMe=deleteMe; Path=/gateway/default; Max-Age=0; Expires=Tue, 23-Oct-2018 06:04:23 GMT
Set-Cookie: rememberMe=deleteMe; Path=/gateway/default; Max-Age=0; Expires=Tue, 23-Oct-2018 06:04:23 GMT
* Authentication problem. Ignoring this.
< WWW-Authenticate: BASIC realm="application"
WWW-Authenticate: BASIC realm="application"
< Content-Length: 0
Content-Length: 0
< Server: Jetty(9.2.15.v20160210)
Server: Jetty(9.2.15.v20160210)
<
* Connection #0 to host <knox> left intact
* Closing connection #0
[hdfs@dev-p76-app-01 ~]$
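For reference, this is roughly the request form I am aiming for, assuming the WEBHDFS service is wired into the admin topology I mentioned and that the op parameter should be ?op= rather than ?=op= (the guest credentials are just the Knox demo ones from the transcript above):

curl -k -i -u guest:guest-password "https://<knox>:8443/gateway/admin/webhdfs/v1/user?op=LISTSTATUS"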
10-23-2018
06:45 PM
Hi, I have Kerberos and NameNode HA enabled on my Hadoop cluster. To enable HA for WebHDFS through Knox, I added the following configuration:

<provider>
    <role>ha</role>
    <name>HaProvider</name>
    <enabled>true</enabled>
    <param>
        <name>WEBHDFS</name>
        <value>maxFailoverAttempts=3;failoverSleep=1000;maxRetryAttempts=300;retrySleep=1000;enabled=true</value>
    </param>
</provider>

<service>
    <role>WEBHDFS</role>
    <url>http://<nn1>:50070/webhdfs</url>
    <url>http://<nn2>:50070/webhdfs</url>
</service>

But the curl command is still failing. I am not using SSL. Can someone point me to the correct curl command I should use, assuming knox1 is the hostname of my Knox gateway? I have used the command below:

$ curl -k -i -vvvv --negotiate -u : "http://<knox1>:50070/gateway/<cluster_name>/webhdfs/v1/user?=op=LISTSTATUS"

I have followed the tutorials below, but they did not help:
https://community.hortonworks.com/questions/35125/knox-error-after-configuring-namenode-ha.html
https://community.hortonworks.com/content/supportkb/150585/how-to-configure-a-knox-topology-for-namenode-ha-1.html
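For reference, this is roughly the shape of the curl call I expect to need, assuming the gateway listens on its default port 8443 (rather than the NameNode port 50070), the topology file is named <cluster_name>.xml, and the op parameter is ?op= rather than ?=op=; whether --negotiate or basic auth applies depends on the authentication provider in that topology, and http:// would replace https:// if SSL is disabled on the gateway:

curl -k -i -vvvv --negotiate -u : "https://<knox1>:8443/gateway/<cluster_name>/webhdfs/v1/user?op=LISTSTATUS"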
Labels:
- Apache Hadoop
- Apache Knox
08-11-2018
08:20 PM
I fixed the issue on RHEL 6.9 by installing libtirpc and libtirpc-devel 0.15 and uninstalling libtirpc 0.13.
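Roughly the kind of commands involved, in case it helps someone else (the exact package version strings below are assumptions and may differ in your repository):

rpm -q libtirpc libtirpc-devel                          # check which builds are currently installed
yum install -y libtirpc-0.15\* libtirpc-devel-0.15\*    # installing the newer build replaces the 0.13 one
rpm -q libtirpc libtirpc-devel                          # confirm only the 0.15 builds remain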
06-26-2018
01:17 PM
Hi, I have a Kerberized cluster. I want to run WebHDFS/REST calls from my laptop. I do not have Knox as of now. How can I do that?
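I assume the call would look something like the sketch below (principal, realm, NameNode host, and the non-SSL port 50070 are placeholders), i.e. the laptop needs a Kerberos client plus network access to the active NameNode:

# obtain a ticket for a principal in the cluster realm, then call WebHDFS with SPNEGO
kinit myuser@EXAMPLE.COM
curl -i --negotiate -u : "http://<active-namenode>:50070/webhdfs/v1/user?op=LISTSTATUS"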
Labels:
- Apache Hadoop
06-23-2018
03:46 PM
@Geoffrey Shelton Okot Can my KDC server and AD be the same? I do not find the HDP documentation straightforward; it lacks clear instructions for enabling Kerberos with AD.
06-23-2018
03:31 PM
@Geoffrey Shelton Okot: Now I need to access my HDP cluster from my laptop using curl/REST API, but I am not able to do so. My laptop is in a different AD domain. I tried enabling SPNEGO/HTTP as well, but no luck. The curl call works inside the cluster but not from outside. Is there any documentation that can help with that?
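For context, the laptop-side pieces I have been trying look roughly like this (the realm name, KDC host, and domain mapping below are placeholders, not my real values), since the laptop sits in a different AD domain and has to know about the cluster realm:

# append the cluster realm and its KDC to the laptop's krb5.conf
cat >> /etc/krb5.conf <<'EOF'
[realms]
  CLUSTER.EXAMPLE.COM = {
    kdc = kdc.cluster.example.com
    admin_server = kdc.cluster.example.com
  }

[domain_realm]
  .cluster.example.com = CLUSTER.EXAMPLE.COM
EOF
# get a ticket in the cluster realm, then retry the SPNEGO curl call from outside
kinit myuser@CLUSTER.EXAMPLE.COM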
06-16-2018
11:14 PM
Why install a broker? Just yum install kafka, and the rest of the configuration can be managed via the command line/custom conf files... you do not need any configuration for the client.
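Something like the following is usually enough on the client side, assuming an HDP-style layout where yum install kafka lays down the CLI tools (broker host, port 6667, and topic name are placeholders):

# install the Kafka package, then point the console tools at an existing broker
yum install -y kafka
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list <broker-host>:6667 --topic test
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --bootstrap-server <broker-host>:6667 --topic test --from-beginning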