Member since: 03-04-2016
Posts: 165
Kudos Received: 35
Solutions: 7
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 876 | 06-20-2017 03:08 PM |
| | 5115 | 05-11-2017 09:59 AM |
| | 6685 | 01-12-2017 01:50 PM |
| | 638 | 10-26-2016 03:02 PM |
| | 2112 | 09-06-2016 07:40 AM |
01-29-2018
10:24 AM
Hi, I have the same problem. Did you resolve it?
01-24-2018
10:01 AM
Hi, I have a problem with Atlas Taxonomy. Any operation on taxonomies takes a long time to run (both via the Web UI and the API). I am using HBase + Solr Cloud. A simple GET on the Atlas taxonomy takes 50 seconds. There are only INFO entries (no ERROR or WARN) in application.log, solr.log, and the Kafka logs. Operations on tags work fine. The Ranger plugin is disabled for Atlas and Kafka. The cluster is secured (Kerberos). Atlas is configured to authenticate via AD, but I am working as the atlas user (I generated a Kerberos ticket for the atlas user for the API). The HBase table 'atlas_titan' has 380k records. I am using HDP 2.6.3 with Ambari 2.6.0. I restarted HBase, ZooKeeper, Sqoop, Oozie, Ambari Infra, and Kafka, as that was suggested as a solution on this community.
Labels:
- Apache Atlas
09-12-2017
02:39 PM
@Leonid Fedotov Have you somehow disabled the scheduler? The problem is that any user can run the scheduler as the user running Zeppelin. I am afraid the only option is to modify the source code.
09-07-2017
06:30 AM
@mqureshi
Thank you for the answer. That is correct. The strange thing is that my RegionServer cannot reach the AD server (ping ad-server.com times out, and I see "WARN Failed to get groups for user"), but the "hdfs groups <ad_user_name>" command returns the groups correctly, and I believe my group policies (I am using Ranger) work correctly. So if I disable LdapGroupsMapping I will no longer be able to grant/revoke access per group, only per user.
09-05-2017
02:45 PM
Hi! My configuration: HDP 2.5.1, Ambari 2.4.2, 7 nodes (2x master, 5x slave), CentOS 6, secured (Kerberos). I have configured LDAP group mapping. When I tried to "scan" an HBase table I got a "timeout error". From the logs: "WARN LdapGroupsMapping: Failed to get groups for user <my_active_directory_user> (retry=0)". After 3 retries I got the "timeout error" in the HBase shell. Each try takes 60 seconds by default, i.e. 180 seconds in total, while the HBase timeout was set to 60 seconds, I believe. I changed the HBase timeout to 240 seconds, and now every "scan" operation completes successfully after 180 seconds (the LDAP groups still cannot be mapped; it is just a WARN whose retry delay causes the HBase timeout). I know I could lower the group mapping timeout to, say, 2 seconds, and then I would get my result in 6 seconds, but that is not a good solution. The root cause is that my RegionServers (for security reasons) are configured not to reach Active Directory. Do you have any ideas for a workaround, other than changing the network layer?
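For reference, the per-lookup LDAP timeout mentioned above is controlled by a Hadoop core-site.xml property; a sketch of the stopgap described (the 2-second value is illustrative, not a recommendation):

```
<property>
  <name>hadoop.security.group.mapping.ldap.directory.search.timeout</name>
  <!-- milliseconds; the default is 10000 (10 s), tried 3 times per lookup -->
  <value>2000</value>
</property>
```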
Labels:
- Apache HBase
07-14-2017
12:32 PM
@Daniel Kozlowski Still not working. I worked around it by using local system authentication (which is SSSD). Now I can log in with just the username, without the domain. Thanks!
07-13-2017
05:37 AM
@Daniel Kozlowski Thank you for the answer. Exactly, I am logging in as user1@MYDOMAIN.COM. However, when I set activeDirectoryRealm.principalSuffix = @MYDOMAIN.COM, I can't log in as user1, or even as user1@MYDOMAIN.COM (LDAP error 49, 52e). When I remove that parameter I can log in with the @MYDOMAIN.COM suffix in upper, lower, or mixed case.
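For context, the realm settings under discussion live in Zeppelin's conf/shiro.ini; a minimal sketch, with placeholder host, bind credentials, and search base:

```ini
[main]
activeDirectoryRealm = org.apache.zeppelin.realm.ActiveDirectoryGroupRealm
activeDirectoryRealm.url = ldap://ad-server.mydomain.com:389
activeDirectoryRealm.systemUsername = bind-user@MYDOMAIN.COM
activeDirectoryRealm.systemPassword = bind-password
activeDirectoryRealm.searchBase = DC=mydomain,DC=com
# the parameter discussed in this thread:
activeDirectoryRealm.principalSuffix = @MYDOMAIN.COM
```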
07-10-2017
12:53 PM
Hi, I am using Zeppelin 0.7.1 configured with AD authentication, and HDP 2.5 with Kerberos. When I simply run "show databases" as "user1" I get the error: "Permission denied: user [user1@MYDOMAIN.COM] does not have [USE] [...]". On my other cluster, with the same configuration, "user1" is treated as "user1" (without @MYDOMAIN.COM), so my policies work fine. Any ideas what the reason could be? EDIT: I have also noticed that the Hive interpreter logs to a separate log file on the working cluster, while on the cluster where the Hive interpreter is not working it logs to the main Zeppelin log file (where authentication is logged, etc.).
Labels:
- Apache Hive
- Apache Zeppelin
06-28-2017
06:53 AM
@Jonas Straub @Vikas Gadade Thank you for clarifying. Is there any workaround for that? Or is it fixed in HDP 2.6? I also use SSSD + Kerberos only on the management nodes. On the NodeManager hosts the AD users do not exist, so YARN does not work.
06-20-2017
03:08 PM
1 Kudo
The problem was case conversion. I initially had upper-case conversion configured for groups, then created some policies, then changed to no conversion, but the upper-case-converted group names still appeared in the .json files.
06-20-2017
12:04 PM
Hi, I have a strange problem. I have configured Ranger Usersync with AD and it works well. All groups, users, and group memberships are synced correctly. Problem scenario: user1 belongs to groups group1 and group2. When I create a policy (of any kind: HDFS, Hive, HBase) for group1, user1 has access. But when I create a policy for group2, user1 does not have access. Both groups are returned when I run "hdfs groups user1". In the Ranger GUI, user1 belongs to both groups. The cluster was Kerberized one week ago, and I do not remember whether both groups worked before Kerberos. HDP 2.5 and Ranger 0.6.0. This is not a sandbox cluster. There are no errors in the ranger/usersync logs. Do you have any ideas?
Labels:
- Apache Ranger
05-22-2017
09:31 AM
Hi @Sushant, please follow the link below to configure user impersonation in the Shell interpreter: https://community.hortonworks.com/articles/81069/how-to-enable-user-impersonation-for-sh-interprete.html
05-12-2017
10:10 AM
I solved the problem by using MySQL master-master replication + MySQL Router as a proxy. It works perfectly!
05-12-2017
10:05 AM
@dvillarreal Never mind, I found that HDFS_DELEGATION_TOKEN does the job. Thanks for the answer anyway.
05-11-2017
01:30 PM
@Avijeet Dash As I remember, I had problems with Hive View 1.5. Could you try using Hive View 1.0?
05-11-2017
10:02 AM
@Tom Stewart Did you define the group names in the [roles] section?
05-11-2017
09:59 AM
Finally solved. I changed hbase.thrift.kerberos.principal to "HTTP/_HOST@myrealm" and now it works. Thanks for the help.
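In hbase-site.xml terms, the fix amounts to something like the following (the realm placeholder is as in the original post; the configured keytab must contain a matching HTTP entry):

```
<property>
  <name>hbase.thrift.kerberos.principal</name>
  <!-- was hbase/_HOST@<myrealm>; SPNEGO clients expect the HTTP service principal -->
  <value>HTTP/_HOST@<myrealm></value>
</property>
```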
05-10-2017
09:41 AM
Oh, OK, thank you for the explanation.
05-10-2017
09:33 AM
@ed day Try this: wget http://public-repo-1.hortonworks.com/HDP/ubuntu14/2.x/updates/2.6.0.3/hdp.list
05-10-2017
09:21 AM
@frank chen Maybe try creating this file manually and leaving it empty, then start the Zeppelin server.
05-10-2017
09:16 AM
@Jay SenSharma I have nothing to add; I just wanted to know why ambari-agent should not be installed on that host (point 3 in your post).
05-10-2017
09:09 AM
@Nikita Uchitelev I suggest installing Zeppelin as an independent service. That means you download the Zeppelin version you want, untar it, copy the contents of the Notebooks folder from your current version, and edit the zeppelin-site.xml and shiro.ini files located in the conf folder. To start Zeppelin, use "./bin/zeppelin-daemon.sh start".
05-10-2017
08:37 AM
@yjiang That helped me as well. Thank you!
04-28-2017
09:59 AM
@emaxwell Thank you for your answer. I have enabled HTTP authentication (SPNEGO), but the same problem still exists. One more question after enabling SPNEGO: what is the username syntax / password for the JobHistory, Oozie, YARN, and other web applications? I cannot log in using any of my Kerberos principals (HTTP 403), but I can successfully log in (without providing credentials) using a local Firefox over X11 forwarding.
04-27-2017
11:21 AM
@Jay SenSharma Thank you for the quick answer. I can successfully connect to the Thrift Server using a shell command as the "hbase" user:
[hbase@<myhost>]# hbase org.apache.hadoop.hbase.thrift.HttpDoAsClient <myhost> 9090 hbase true
So the "hbase" user is authenticated correctly. The problem is that HUE cannot access HBase Thrift. It seems HUE is using a different user than "hbase" to make the connection. I am using HUE 3.11.
04-27-2017
11:21 AM
Hi guys, when I try to access HBase in HUE I get the following error:
Api Error: Unable to authenticate <Response [401]>
And in the Thrift Server log:
2017-04-19 10:40:47,312 ERROR [1884538648@qtp-1493988307-4] thrift.ThriftHttpServlet: Kerberos Authentication failed
org.apache.hadoop.hbase.thrift.HttpAuthenticationException: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:139)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:86)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:767)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.Server.handle(Server.java:326)
at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:945)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:756)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:134)
... 16 more
Caused by: org.apache.hadoop.hbase.thrift.HttpAuthenticationException: Kerberos authentication failed:
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:190)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:144)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
... 17 more
Caused by: GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)
at sun.security.jgss.krb5.Krb5Context.acceptSecContext(Krb5Context.java:856)
at sun.security.jgss.GSSContextImpl.acceptSecContext(GSSContextImpl.java:342)
at sun.security.jgss.GSSContextImpl.acceptSecContext(GSSContextImpl.java:285)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:178)
... 21 more
Caused by: KrbException: Checksum failed
at sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType.decrypt(Aes256CtsHmacSha1EType.java:102)
at sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType.decrypt(Aes256CtsHmacSha1EType.java:94)
at sun.security.krb5.EncryptedData.decrypt(EncryptedData.java:175)
at sun.security.krb5.KrbApReq.authenticate(KrbApReq.java:291)
at sun.security.krb5.KrbApReq.<init>(KrbApReq.java:159)
at sun.security.jgss.krb5.InitSecContextToken.<init>(InitSecContextToken.java:108)
at sun.security.jgss.krb5.Krb5Context.acceptSecContext(Krb5Context.java:829)
... 24 more
Caused by: java.security.GeneralSecurityException: Checksum failed
at sun.security.krb5.internal.crypto.dk.AesDkCrypto.decryptCTS(AesDkCrypto.java:451)
at sun.security.krb5.internal.crypto.dk.AesDkCrypto.decrypt(AesDkCrypto.java:272)
at sun.security.krb5.internal.crypto.Aes256.decrypt(Aes256.java:76)
at sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType.decrypt(Aes256CtsHmacSha1EType.java:100)
... 30 more
hbase-site.xml:
hbase.thrift.security.qop = auth
hbase.thrift.keytab.file = /etc/security/keytabs/hbase.service.keytab
hbase.thrift.kerberos.principal = hbase/_HOST@<myrealm>
Test of the Thrift server as the "hbase" user:
[$] hbase org.apache.hadoop.hbase.thrift.HttpDoAsClient <myhost> 9090 hbase true
Debug is true storeKey false useTicketCache true useKeyTab false doNotPrompt true ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is true principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
Refreshing Kerberos configuration
Acquire TGT from Cache
Principal is hbase/<myhost>@<myrealm>
Commit Succeeded
Debug is true storeKey false useTicketCache true useKeyTab false doNotPrompt true ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is true principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
Refreshing Kerberos configuration
Acquire TGT from Cache
Principal is hbase/<myhost>@<myrealm>
Commit Succeeded
scanning tables...
Ticket is: Negotiate YIICcgYJKoZIhvcSAQICAQBuggJhMIICXaADAgEFoQMCAQ6iBwMFACAAAACjggFmYYIBYjCCAV6gAwIBBaEMGwpIQURPT1AuQ09NoiIwIKADAgEAoRkwFxsFaGJhc2UbDmhhZG9vcDEubG9jYWxko4IBIzCCAR+gAwIBEqEDAgEGooIBEQSCAQ0ZZ0Vi9KwMpyA65xvKOwm7bXFnTr4EXwWj7ikQ8U6HPh2RfHwO39T76vyBFzR0D3Oervgpr4jyKyT+o0NYylSKwDr4iPUpZPUeRzi5wWxgb4+bPDB/UwgYzZOMXtv4Ewx8KuSzafv8Nxc/3X32cOD2gXZ2l4DpVO4HcZDZ/7DOmQRAYzXclkIRuWfMyYxqnjx9ebqTph/18e1OrAeADnOYYORPtUHvKDydVVlEO5k0zp0LBdj68TOD40TzX+ED3K8yurXoU3UWuAg6/vGV+5s4T7J5R+7uMolhwjL4utxi95rCzbDgE6bVeOp92SiZUtGZKWcLze1F7SpFIvbSmkrFs94/Laey5+5c+yOY56SB3TCB2qADAgESooHSBIHPcgeIkkTSTYOxT7rZDtuXijHPf3h+j/p8lB6B07Saw4wwA82P6TPesozw0Tl/G4m/mabuyJgDHqHEyxu2/eG0tDD1V3eVs+x8y+EptcGI0wvCaSvK0S4Q8kZ30bRV7NFegtS1LlYbfbXD7zqrX1CByqr3s92DAzuc8CO6yRY18ZNs8aiP0BhVciVT2pwwTl86iA3ZJbW2JsGgnr1uif/0tqqI6yaZvoANVCAk/6LZXZm1LjJiS7BqCFRdWMIs2Ujl3NFzPnD446+s0r/rCxdn
found: ATLAS_ENTITY_AUDIT_EVENTS
found: SESSIONS_SECONDARY
found: atlas_titan
found: demo_table
I have configured 2 KDCs (master/slave) and a one-way trust to AD.
EDIT: I am now getting "Authorization header received from the client is empty." HBase Thrift log:
2017-04-27 12:32:49,220 ERROR [1560162680@qtp-1865201235-5] thrift.ThriftHttpServlet: Failed to perform authentication
2017-04-27 12:32:49,220 ERROR [1560162680@qtp-1865201235-5] thrift.ThriftHttpServlet: Kerberos Authentication failed
org.apache.hadoop.hbase.thrift.HttpAuthenticationException: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:139)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:86)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:767)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.Server.handle(Server.java:326)
at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:945)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:756)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:134)
... 16 more
Caused by: org.apache.hadoop.hbase.thrift.HttpAuthenticationException: Authorization header received from the client is empty.
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.getAuthHeader(ThriftHttpServlet.java:212)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:176)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:144)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
... 17 more
2017-04-27 12:32:49,229 ERROR [1560162680@qtp-1865201235-5] thrift.ThriftHttpServlet: Failed to perform authentication
2017-04-27 12:32:49,229 ERROR [1560162680@qtp-1865201235-5] thrift.ThriftHttpServlet: Kerberos Authentication failed
org.apache.hadoop.hbase.thrift.HttpAuthenticationException: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:139)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:86)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:767)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.Server.handle(Server.java:326)
at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:945)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:756)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:134)
... 16 more
Caused by: org.apache.hadoop.hbase.thrift.HttpAuthenticationException: Kerberos authentication failed:
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:190)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:144)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
... 17 more
Caused by: GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)
at sun.security.jgss.krb5.Krb5Context.acceptSecContext(Krb5Context.java:856)
at sun.security.jgss.GSSContextImpl.acceptSecContext(GSSContextImpl.java:342)
at sun.security.jgss.GSSContextImpl.acceptSecContext(GSSContextImpl.java:285)
at org.apache.hadoop.hbase.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:178)
... 21 more
Caused by: KrbException: Checksum failed
at sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType.decrypt(Aes256CtsHmacSha1EType.java:102)
at sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType.decrypt(Aes256CtsHmacSha1EType.java:94)
at sun.security.krb5.EncryptedData.decrypt(EncryptedData.java:175)
at sun.security.krb5.KrbApReq.authenticate(KrbApReq.java:291)
at sun.security.krb5.KrbApReq.<init>(KrbApReq.java:159)
at sun.security.jgss.krb5.InitSecContextToken.<init>(InitSecContextToken.java:108)
at sun.security.jgss.krb5.Krb5Context.acceptSecContext(Krb5Context.java:829)
... 24 more
Caused by: java.security.GeneralSecurityException: Checksum failed
at sun.security.krb5.internal.crypto.dk.AesDkCrypto.decryptCTS(AesDkCrypto.java:451)
at sun.security.krb5.internal.crypto.dk.AesDkCrypto.decrypt(AesDkCrypto.java:272)
at sun.security.krb5.internal.crypto.Aes256.decrypt(Aes256.java:76)
at sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType.decrypt(Aes256CtsHmacSha1EType.java:100)
... 30 more
And in the Hue error.log:
[27/Apr/2017 12:49:20 +0200] views ERROR failed to parse input as json
Traceback (most recent call last):
File "/home/hue/hue/hue/apps/hbase/src/hbase/views.py", line 55, in safe_json_load
return json.loads(re.sub(r'(?:\")([0-9]+)(?:\")', r'\1', str(raw)))
File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[27/Apr/2017 12:49:20 +0200] api ERROR failed to load the HBase clusters
Traceback (most recent call last):
File "/home/hue/hue/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[27/Apr/2017 12:49:20 +0200] views ERROR failed to parse input as json
Traceback (most recent call last):
File "/home/hue/hue/hue/apps/hbase/src/hbase/views.py", line 55, in safe_json_load
return json.loads(re.sub(r'(?:\")([0-9]+)(?:\")', r'\1', str(raw)))
File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[27/Apr/2017 12:49:20 +0200] views ERROR failed to parse input as json
Traceback (most recent call last):
File "/home/hue/hue/hue/apps/hbase/src/hbase/views.py", line 55, in safe_json_load
return json.loads(re.sub(r'(?:\")([0-9]+)(?:\")', r'\1', str(raw)))
File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[27/Apr/2017 12:49:20 +0200] api ERROR failed to load the HBase clusters
Traceback (most recent call last):
File "/home/hue/hue/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[27/Apr/2017 12:49:20 +0200] kerberos_ ERROR authenticate_server(): Authenticate header:
[27/Apr/2017 12:49:20 +0200] kerberos_ ERROR authenticate_server(): authGSSClientStep() failed:
Traceback (most recent call last):
File "/home/hue/hue/hue/build/env/lib/python2.6/site-packages/requests_kerberos-0.6.1-py2.6.egg/requests_kerberos/kerberos_.py", line 229, in authenticate_server
_negotiate_value(response))
GSSError: (('Invalid token was supplied', 589824), ('Success', 100001))
[27/Apr/2017 12:49:20 +0200] kerberos_ ERROR handle_mutual_auth(): Mutual authentication failed
[27/Apr/2017 12:49:20 +0200] thrift_util ERROR Thrift saw exception (this may be expected).
Traceback (most recent call last):
File "/home/hue/hue/hue/desktop/core/src/desktop/lib/thrift_util.py", line 425, in wrapper
ret = res(*args, **kwargs)
File "/home/hue/hue/hue/apps/hbase/gen-py/hbased/Hbase.py", line 53, in decorate
return func(*args, **kwargs)
File "/home/hue/hue/hue/apps/hbase/gen-py/hbased/Hbase.py", line 832, in getTableNames
self.send_getTableNames()
File "/home/hue/hue/hue/apps/hbase/gen-py/hbased/Hbase.py", line 840, in send_getTableNames
self._oprot.trans.flush()
File "/home/hue/hue/hue/build/env/lib/python2.6/site-packages/thrift-0.9.1-py2.6-linux-x86_64.egg/thrift/transport/TTransport.py", line 170, in flush
self.__trans.flush()
File "/home/hue/hue/hue/desktop/core/src/desktop/lib/thrift_/http_client.py", line 84, in flush
self._data = self._root.post('', data=data, headers=self._headers)
File "/home/hue/hue/hue/desktop/core/src/desktop/lib/rest/resource.py", line 132, in post
allow_redirects=allow_redirects, clear_cookies=clear_cookies)
File "/home/hue/hue/hue/desktop/core/src/desktop/lib/rest/resource.py", line 81, in invoke
clear_cookies=clear_cookies)
File "/home/hue/hue/hue/desktop/core/src/desktop/lib/rest/http_client.py", line 173, in execute
raise self._exc_class(ex)
RestException: Unable to authenticate <Response [401]>
Labels:
- Apache HBase
- Cloudera Hue
04-26-2017
08:20 AM
Hi, in a Kerberized cluster I am using 2 local KDCs (master/slave) on Namenode1 and Namenode2. I configured a one-way trust to Active Directory. Everything works fine on these Namenodes. Is it possible to route the connection from a Datanode to Active Directory via the local KDC on the Namenode? My Datanode has no connection to AD, so I can't work on Datanodes as an Active Directory user. The thing is that I would like to generate a ticket for an AD user on the Namenode and work with the same ticket on other hosts that do not have access to AD (the IP address is blocked). Thank you in advance.
Labels:
- Kerberos
04-24-2017
09:04 AM
2 Kudos
Hi @Stefan Schuster, first of all check your local machine's IP address. Then add the following line at the end of your hosts file: machine_ip_address sandbox.hortonworks.com. After this, http://sandbox.hortonworks.com:port will work. The path to the file depends on your local OS; on Windows it is C:\Windows\System32\drivers\etc\hosts.
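For example, assuming the sandbox VM answers at 192.168.56.100 (a placeholder; substitute the address your VM actually reports), the hosts entry would be:

```
192.168.56.100   sandbox.hortonworks.com
```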
04-24-2017
08:53 AM
@HadoopAdmin India Glad to hear you resolved it. That's why I asked whether you had given access to the resource via Ranger; "XASecurePDPKnox" is the Ranger authorization provider.