Member since: 06-21-2016
Posts: 25
Kudos Received: 0
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 10757 | 03-27-2019 08:07 AM |
03-27-2019
08:07 AM
Hi community, I've fixed the issue by adding the Kerberos host principal below to the file /etc/krb5.keytab: host/fqdn_hostname@REALM. The one that was previously set did not match my environment configuration: host/UNKNOWN_DOMAIN@UNKNOWN_REALM
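For anyone hitting the same error, here is a minimal sketch of how the keytab entry can be inspected and re-created with standard MIT Kerberos tools; the admin principal and realm below are assumptions about your environment (on FreeIPA hosts, ipa-getkeytab would be the usual tool instead):
# Inspect the host principals currently stored in the system keytab
klist -kt /etc/krb5.keytab
# Re-create the entry with the correct FQDN and realm
# (note: ktadd rotates the key, so any old copy of the keytab becomes stale)
kadmin -p admin/admin@EXAMPLE.COM -q "ktadd -k /etc/krb5.keytab host/$(hostname -f)@EXAMPLE.COM"
# Verify the new entry
klist -kt /etc/krb5.keytab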
03-12-2019
04:01 PM
Hi guys, I found an environment where ksu works. My issue seems to be related to some sssd configuration, but I still have not managed to solve it. Does this remind you of anything regarding sssd configuration? Thank you.
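In case it rings a bell for someone, here is a hedged sketch of what I am comparing between the two environments; sssd generates Kerberos snippets under the includedir referenced from krb5.conf, and those can influence how ksu resolves realms and validates tickets (the exact file names vary, so treat them as assumptions):
# List the Kerberos configuration snippets generated by sssd on each host
ls -l /var/lib/sss/pubconf/krb5.include.d/
# Compare their contents (domain_realm mappings, localauth plugin, libdefaults)
cat /var/lib/sss/pubconf/krb5.include.d/*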
02-21-2019
08:27 AM
Thanks for your reply, but I am still getting the issue with your settings.
02-20-2019
05:16 PM
Can you please be more precise about how to change that file?
02-20-2019
05:16 PM
Here is my krb5.conf. For security purposes I do not provide my environment's real values, but be sure that they match EXAMPLE.COM and UNKNOWN_DOMAIN:
includedir /etc/krb5.conf.d/
includedir /var/lib/sss/pubconf/krb5.include.d/

[libdefaults]
    default_realm = EXAMPLE.COM
    dns_lookup_realm = true
    dns_lookup_kdc = true
    rdns = false
    ticket_lifetime = 24h
    renew_lifetime = 7d
    forwardable = true
    udp_preference_limit = 0
    default_ccache_name = /tmp/krb5cc_%{uid}

[logging]
    default = FILE:/var/log/krb5kdc.log
    admin_server = FILE:/var/log/kadmind.log
    kdc = FILE:/var/log/krb5kdc.log

[realms]
    UNKNOWN_DOMAIN = {
        pkinit_anchors = FILE:/etc/ipa/ca.crt
    }
    EXAMPLE.COM = {
        admin_server = myadmin.server.com
        kdc = myadmin.server.com
    }

[domain_realm]
    .unknown_domain = UNKNOWN_DOMAIN
    unknown_domain = UNKNOWN_DOMAIN
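One thing I am double-checking (a hedged sketch only, with placeholder names): if the host's actual DNS domain is not mapped to EXAMPLE.COM under [domain_realm], the Kerberos library guesses the realm of host/fqdn_hostname and can end up asking the KDC for a cross-realm ticket such as krbtgt/UNKNOWN_DOMAIN@EXAMPLE.COM. An explicit mapping would look like:
[domain_realm]
    .example.com = EXAMPLE.COM
    example.com = EXAMPLE.COM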
02-20-2019
05:16 PM
Hi community, I am studying ksu for some use cases and found this link: https://web.mit.edu/kerberos/krb5-1.5/krb5-1.5.4/doc/krb5-user/ksu.html I have a user1 with a KDC entry and a keytab. Just before running ksu, I kinit as user1 to get a Kerberos ticket:
[user1@server1 ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_1003293697
Default principal: user1@EXAMPLE.COM
Valid starting Expires Service principal
02/18/2019 09:13:12 02/19/2019 09:13:12 krbtgt/EXAMPLE.COM@EXAMPLE.COM
Then, I want user1 to ksu to user2. For this to work, I have created a .k5login file in user2's home directory containing user1@EXAMPLE.COM. Then, I launch ksu as user1 and hit this error:
[user1@server1 ~]$ ksu user2
ksu: Server not found in Kerberos database while verifying ticket for server
Authentication failed.
Looking for an error in /var/log/krb5kdc.log, I found this one:
UNKNOWN_SERVER: authtime 0, user1@EXAMPLE.COM for krbtgt/UNKNOWN_DOMAIN@EXAMPLE.COM, Server not found in Kerberos database
As the error states, the service principal name krbtgt/UNKNOWN_DOMAIN@EXAMPLE.COM is unknown to the KDC database, which is correct. The problem is that I expected the SPN to be krbtgt/EXAMPLE.COM@EXAMPLE.COM, just like what I can see in my user1 klist output. As I don't really know how to fix this, does someone have an idea, please? Various websites and forums mention FQDNs, reverse DNS, and /etc/hosts and /etc/resolv.conf configuration, but none of that solved my issue. Thank you in advance for your help.
Labels:
- Apache Hadoop
10-01-2018
12:04 PM
Hi Vrathod, Same error with Firefox 62.0.2 (64-bit) and SPNEGO enabled for my domain. NB: the local computer that tries to access the Druid UI runs Windows. Do I have to install Kerberos and follow instructions 1 to 3 as well? Regards
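For reference, these are the Firefox about:config preferences I understand drive SPNEGO (the domain pattern below is a placeholder); as far as I know, on Windows Firefox can use the native SSPI ticket, so a separate MIT Kerberos install may not be required:
network.negotiate-auth.trusted-uris = .mydomain.com
network.negotiate-auth.delegation-uris = .mydomain.com
network.auth.use-sspi = true   (Windows only; lets Firefox use the SSPI/Active Directory ticket)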
09-28-2018
10:00 AM
Hi, I freshly installed Druid 0.10.1 through Ambari on a Kerberized HDP 2.6.4 cluster. Everything runs well, but when I try to access the Druid Coordinator and Overlord consoles from the Ambari Quick Links, I get a 403 error: Problem accessing /. Reason:
org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: Defective token detected (Mechanism level: GSSHeader did not find the right tag)
I guess this is related to Kerberos, but I still don't know the root cause. Thank you for your help. Regards
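As far as I understand, the "Defective token" message usually means the client sent something other than a Kerberos/SPNEGO token. A hedged way to test the console with a proper ticket, outside the browser (host name and port below are placeholders):
# Get a Kerberos ticket first
kinit myuser@EXAMPLE.COM
# Let curl negotiate SPNEGO against the Druid coordinator console
curl -iv --negotiate -u : http://my_druid_coordinator:8081/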
09-13-2018
03:39 PM
Hi @Saby SS, Have you tried this? https://community.hortonworks.com/content/supportkb/191924/how-to-improve-performance-of-knox-request-per-sec.html
09-13-2018
03:22 PM
Hi @Ashokkumar R #2: here is the documentation for HDP 2.6.5: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_security/content/configure_knox_for_ha.html Regards.
06-15-2018
02:33 PM
@Felix Albani is there a relation between the WebHCat server and the Hive server or the Knox server? I still don't understand how these components are related to each other. Thank you.
06-08-2018
03:41 PM
@Hernán Fernández same thing. Here is the command I typed: curl -ivk --negotiate 'https://my_knox_hostname:9443/gateway/default/hive/?op=LISTSTATUS'
06-07-2018
03:08 PM
@Felix Albani 1 - I confirm these settings; just to let you know that hadoop.proxyuser.knox.hosts is not set to "*" but to a list of hostnames, and I confirm my Knox hostname is in that list. 2 - For specific reasons, I don't actually manage the hive-site content with Ambari, as I am using a particular piece of software that has its own hive-site.xml. As far as I can see, the webhcat settings are not written into this file, so I've asked the software vendor how I can handle this. 3 - How can I activate debug logging for HS2, please? 4 - Can you please tell me more about how to use this? When I type this command on my Knox host, I get:
$ tcpdump -A port 10011
tcpdump: WARNING: SIOCGIFADDR: nflog: No such device
tcpdump: NFLOG link-layer type filtering not implemented
Thank you.
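In case it helps, here is a hedged sketch of the capture I would run; the nflog warning above suggests tcpdump picked a pseudo-device, so forcing an interface (the name below is an assumption) should avoid it:
# Capture the Knox -> HiveServer2 HTTP traffic on the HS2 HTTP port
# -i any : listen on all interfaces instead of the nflog pseudo-device
# -A     : print payloads as ASCII so the HTTP headers are readable
# -s 0   : capture full packets
tcpdump -i any -A -s 0 port 10011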
06-07-2018
02:49 PM
@Hernán Fernández Here is the result of the command below:
curl -ivk --negotiate -u myuser:mypasswd 'https://my_knox_hostname:9443/gateway/default/hive/?op=LISTSTATUS'
* About to connect() to my_knox_hostname port 9443 (#0)
* Trying XXX.XXX.XXX.XXX...
* Connected to my_knox_hostname (XXX.XXX.XXX.XXX) port 9443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* skipping SSL peer certificate verification
* SSL connection using TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
* Server certificate:
*
*
*
> GET /gateway/default/hive/?op=LISTSTATUS HTTP/1.1
> User-Agent: curl/7.29.0
> Host: my_knox_hostname:9443
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
HTTP/1.1 401 Unauthorized
< Date: Thu, 07 Jun 2018 14:41:27 GMT
Date: Thu, 07 Jun 2018 14:41:27 GMT
< WWW-Authenticate: BASIC realm="application"
WWW-Authenticate: BASIC realm="application"
< Content-Length: 0
Content-Length: 0
< Server: Jetty(9.2.15.v20160210)
Server: Jetty(9.2.15.v20160210)
<
* Connection #0 to host my_knox_hostname left intact
06-05-2018
12:56 PM
@Hernán Fernández let me try this and I will come back to you. Thank you.
06-05-2018
12:52 PM
@Felix Albani I don't have any logs on my Hive server. My guess is that the connection passes the Knox gateway but does not get through to Hive, which asks for credentials that the POST request does not deliver. Here are my Knox gateway logs when trying to reach my Hive server with the beeline connection mentioned previously:
2018-05-24 10:32:51,330 DEBUG hadoop.gateway (GatewayFilter.java:doFilter(116)) - Received request: POST /hive
2018-05-24 10:32:51,382 DEBUG hadoop.gateway (KnoxLdapRealm.java:getUserDn(718)) - Searching from dc=domain,dc=realm where (&(objectclass=posixAccount)(uid=myuser)) scope subtree
2018-05-24 10:32:51,387 INFO hadoop.gateway (KnoxLdapRealm.java:getUserDn(724)) - Computed userDn: uid=myuser,ou=Users,dc=domain,dc=realm using ldapSearch for principal: myuser
2018-05-24 10:32:51,441 INFO hadoop.gateway (AclsAuthorizationFilter.java:doFilter(85)) - Access Granted: true
2018-05-24 10:32:51,442 DEBUG hadoop.gateway (UrlRewriteProcessor.java:rewrite(164)) - Rewrote URL: https://my_knox_hostname:9443/gateway/default/hive, direction: IN via implicit rule: HIVE/hive/inbound to URL: http://my_http_hive_hostname:10011/cliservice
2018-05-24 10:32:51,443 DEBUG hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(121)) - Dispatch request: POST http://my_http_hive_hostname:10011/cliservice?doAs=myuser
2018-05-24 10:32:51,461 DEBUG hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(134)) - Dispatch response status: 401
2018-05-24 10:32:51,461 DEBUG hadoop.gateway (DefaultDispatch.java:getInboundResponseContentType(209)) - Inbound response entity content type not provided.
2018-05-24 10:32:51,471 DEBUG hadoop.gateway (GatewayFilter.java:doFilter(116)) - Received request: POST /hive
2018-05-24 10:32:51,472 INFO hadoop.gateway (AclsAuthorizationFilter.java:doFilter(85)) - Access Granted: true
2018-05-24 10:32:51,472 DEBUG hadoop.gateway (UrlRewriteProcessor.java:rewrite(164)) - Rewrote URL: https://my_knox_hostname:9443/gateway/default/hive, direction: IN via implicit rule: HIVE/hive/inbound to URL: http://my_http_hive_hostname:10011/cliservice
2018-05-24 10:32:51,473 DEBUG hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(121)) - Dispatch request: POST http://my_http_hive_hostname:10011/cliservice?doAs=myuser
2018-05-24 10:32:51,488 DEBUG hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(134)) - Dispatch response status: 401
2018-05-24 10:32:51,488 DEBUG hadoop.gateway (DefaultDispatch.java:getInboundResponseContentType(209)) - Inbound response entity content type not provided.
2018-05-24 10:32:51,524 DEBUG hadoop.gateway (GatewayFilter.java:doFilter(116)) - Received request: POST /hive
2018-05-24 10:32:51,575 DEBUG hadoop.gateway (KnoxLdapRealm.java:getUserDn(718)) - Searching from dc=domain,dc=realm where (&(objectclass=posixAccount)(uid=myuser)) scope subtree
2018-05-24 10:32:51,579 INFO hadoop.gateway (KnoxLdapRealm.java:getUserDn(724)) - Computed userDn: uid=myuser,ou=Users,dc=domain,dc=realm using ldapSearch for principal: myuser
2018-05-24 10:32:51,631 INFO hadoop.gateway (AclsAuthorizationFilter.java:doFilter(85)) - Access Granted: true
2018-05-24 10:32:51,632 DEBUG hadoop.gateway (UrlRewriteProcessor.java:rewrite(164)) - Rewrote URL: https://my_knox_hostname:9443/gateway/default/hive, direction: IN via implicit rule: HIVE/hive/inbound to URL: http://my_http_hive_hostname:10011/cliservice
2018-05-24 10:32:51,633 DEBUG hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(121)) - Dispatch request: POST http://my_http_hive_hostname:10011/cliservice?doAs=myuser
2018-05-24 10:32:51,648 DEBUG hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(134)) - Dispatch response status: 401
2018-05-24 10:32:51,649 DEBUG hadoop.gateway (DefaultDispatch.java:getInboundResponseContentType(209)) - Inbound response entity content type not provided.
2018-05-24 10:32:51,658 DEBUG hadoop.gateway (GatewayFilter.java:doFilter(116)) - Received request: POST /hive
2018-05-24 10:32:51,659 INFO hadoop.gateway (AclsAuthorizationFilter.java:doFilter(85)) - Access Granted: true
2018-05-24 10:32:51,660 DEBUG hadoop.gateway (UrlRewriteProcessor.java:rewrite(164)) - Rewrote URL: https://my_knox_hostname:9443/gateway/default/hive, direction: IN via implicit rule: HIVE/hive/inbound to URL: http://my_http_hive_hostname:10011/cliservice
2018-05-24 10:32:51,661 DEBUG hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(121)) - Dispatch request: POST http://my_http_hive_hostname:10011/cliservice?doAs=myuser
2018-05-24 10:32:51,678 DEBUG hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(134)) - Dispatch response status: 401
2018-05-24 10:32:51,678 DEBUG hadoop.gateway (DefaultDispatch.java:getInboundResponseContentType(209)) - Inbound response entity content type not provided.
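For what it's worth, here is a hedged list of the HiveServer2 and Hadoop properties I have seen mentioned for Knox access to HS2 in HTTP mode; the property names are standard, but whether they are the culprit in my case is an assumption:
# hive-site.xml (HiveServer2 side)
hive.server2.transport.mode          = http
hive.server2.thrift.http.path        = cliservice
hive.server2.allow.user.substitution = true
# core-site.xml (so Hadoop trusts the Knox principal to impersonate end users via doAs)
hadoop.proxyuser.knox.hosts  = <knox_host_fqdn>
hadoop.proxyuser.knox.groups = *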
06-05-2018
10:00 AM
Thank you Felix Albani for your answer. I forgot to mention that I get a 401 response error using a similar beeline connect string:
!connect jdbc:hive2://my_knox_hostname:9443/;ssl=true;transportMode=http;httpPath=gateway/default/hive;sslTrustStore=/etc/pki/ca-trust/extracted/java/cacerts;trustStorePassword=trust_passwd
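For completeness, a sketch of how the same connection can be attempted non-interactively, assuming the LDAP credentials are passed with -n/-p (the user and password are placeholders):
beeline -u "jdbc:hive2://my_knox_hostname:9443/;ssl=true;transportMode=http;httpPath=gateway/default/hive;sslTrustStore=/etc/pki/ca-trust/extracted/java/cacerts;trustStorePassword=trust_passwd" -n myuser -p mypasswd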
06-04-2018
02:04 PM
Hi everyone, I am facing an issue that has been burning my brain for a couple of days; I hope you can help me with it. I have a HiveServer2 running in HTTP mode with Kerberos, and I can connect fine using beeline from another server of my cluster with the JDBC URI below (after getting a Kerberos ticket with kinit):
jdbc:hive2://my_hive_server:10011/;principal=myprincipal/hostname@domain;transportMode=http;httpPath=cliservice
The problem comes when I try to connect to this HiveServer2 through Knox with LDAP user credentials; I get a 401 error. I have tried many configurations found on this community site and through googling, but without success. The same issue occurs using curl:
curl -iv -k -u myuser:mypasswd -X GET 'https://my_knox_hostname:9443/gateway/default/hive/?op=LISTSTATUS'
* Server auth using Basic with user 'myuser'
> GET /gateway/default/hive/?op=LISTSTATUS HTTP/1.1
> Authorization: Basic WDExMTExNTpoYWhhaGE=
> User-Agent: curl/7.29.0
> Host: my_knox_hostname:9443
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Date: Mon, 04 Jun 2018 07:46:45 GMT
< Set-Cookie: JSESSIONID=5v2868pq8l6m1mc3lt5u6l156;Path=/gateway/default;Secure;HttpOnly
< Expires: Thu, 01 Jan 1970 00:00:00 GMT
< Set-Cookie: rememberMe=deleteMe; Path=/gateway/default; Max-Age=0; Expires=Sun, 03-Jun-2018 07:46:45 GMT
< Server: Jetty(7.6.0.v20120127)
< Content-Length: 69
<
Authentication Error: java.lang.reflect.UndeclaredThrowableException
* Connection #0 to host my_knox_hostname left intact
Some help would be appreciated; thank you in advance. Regards.
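For context, here is a hedged sketch of what I understand the HIVE entry in the Knox topology (e.g. default.xml) should look like for an HTTP-mode HiveServer2; the host and port mirror my setup but are placeholders:
<service>
    <role>HIVE</role>
    <url>http://my_hive_server:10011/cliservice</url>
</service>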
Labels:
- Apache Hive
- Apache Knox
05-31-2018
07:57 AM
Hi Shota, Have you fixed your problem? I am currently facing the same issue. Thanks.
05-23-2018
12:45 PM
Hi everyone, I am facing the same issue connecting to Hive over HTTP through Knox. In the Knox gateway logs, I have: access|uri|/gateway/default/hive/?op=LISTSTATUS|success|Response status: 401 @mliem: have you fixed your problem, and if so, can you please tell me how? FYI, connecting directly to Hive over HTTP with beeline and the same credentials works fine. Regards.
06-21-2016
06:51 PM
@Constantin Stanca Thank you for your quick answer. As the server I work on is not planned to be migrated to RHEL 6.x, can you confirm that I can use the RPM version you provided for CentOS 5.x? I guess it is not optimal, but are there any known issues using this version with a 1.x Hive database? Thank you.
06-21-2016
03:03 PM
Hi, I am looking for the Hortonworks ODBC driver for Hive for RHEL 5.x, but I cannot seem to find it. Version 2.1.2 is only for RHEL 6.x and does not install, and the old versions are only for CentOS 5.x. Can you please help me find such a driver? Thank you.
Labels:
- Apache Hive