Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2568 | 11-01-2016 05:43 PM |
| | 8502 | 11-01-2016 05:36 PM |
| | 4860 | 07-01-2016 03:20 PM |
| | 8182 | 05-25-2016 11:36 AM |
| | 4335 | 05-24-2016 05:27 PM |
01-05-2017 11:44 PM
We are having the same issue, and our HDP version is 2.4.2. Here are all the settings we have already implemented. Our Beeline works for all users, and there is no permission issue either. Here are the error logs; I have also attached a few settings from our environment.
2017-01-03 10:04:22,851 INFO [HiveServer2-Handler-Pool: Thread-67181]: thrift.ThriftCLIService (ThriftCLIService.java:OpenSession(294)) - Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V1
2017-01-03 10:04:22,854 WARN [HiveServer2-Handler-Pool: Thread-67181]: thrift.ThriftCLIService (ThriftCLIService.java:OpenSession(308)) - Error opening session:
org.apache.hive.service.cli.HiveSQLException: Failed to validate proxy privilege of tabsrvtest for btaylo
at org.apache.hive.service.auth.HiveAuthFactory.verifyProxyAccess(HiveAuthFactory.java:379)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getProxyUser(ThriftCLIService.java:731)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getUserName(ThriftCLIService.java:367)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:394)
at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:297)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1257)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1242)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:562)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.authorize.AuthorizationException: User: tabsrvtest is not allowed to impersonate btaylo
at org.apache.hadoop.security.authorize.DefaultImpersonationProvider.authorize(DefaultImpersonationProvider.java:119)
at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:102)
at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:116)
at org.apache.hive.service.auth.HiveAuthFactory.verifyProxyAccess(HiveAuthFactory.java:375)
... 13 more
2017-01-03 10:04:22,866 WARN [HiveServer2-Handler-Pool: Thread-67181]: thrift.ThriftCLIService (ThriftCLIService.java:CloseSession(456)) - Error closing session:
java.nio.BufferUnderflowException
at java.nio.Buffer.nextGetIndex(Buffer.java:506)
at java.nio.HeapByteBuffer.getLong(HeapByteBuffer.java:412)
at org.apache.hive.service.cli.HandleIdentifier.<init>(HandleIdentifier.java:46)
at org.apache.hive.service.cli.Handle.<init>(Handle.java:38)
at org.apache.hive.service.cli.SessionHandle.<init>(SessionHandle.java:45)
at org.apache.hive.service.cli.SessionHandle.<init>(SessionHandle.java:41)
at org.apache.hive.service.cli.thrift.ThriftCLIService.CloseSession(ThriftCLIService.java:447)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$CloseSession.getResult(TCLIService.java:1277)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$CloseSession.getResult(TCLIService.java:1262)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:562)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Attachments: core-site.png, hive-settings.png
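For anyone hitting this, the AuthorizationException usually means the proxy-user grants are missing. A minimal sketch of the core-site.xml entries that would allow it, assuming the tabsrvtest service account should be able to impersonate end users (the wildcard values are illustrative and should be narrowed to specific hosts and groups in production):

```xml
<!-- core-site.xml: let the tabsrvtest service account impersonate end users.
     Wildcards are illustrative; restrict to real hosts/groups in production. -->
<property>
  <name>hadoop.proxyuser.tabsrvtest.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.tabsrvtest.groups</name>
  <value>*</value>
</property>
```

HDFS (and HiveServer2) need a restart after changing these properties for the new proxy-user rules to take effect.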
05-30-2016 12:46 PM
Hi, if I understand correctly, we can start multiple NFS gateway servers on multiple nodes (DataNode, NameNode, HDFS client). Say we have (servernfs01, servernfs02, servernfs03) and (client01, client02):
client01# mount -t nfs servernfs01:/ /test01
client02# mount -t nfs servernfs02:/ /test02
My question is: how do we avoid a service interruption? What happens if servernfs01 fails? How do we keep access to HDFS for client01 in that case?
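One hedged sketch of how a client could ride out a gateway failure, assuming all three gateways export the same HDFS namespace: the options vers=3, proto=tcp, and nolock are the ones the HDFS NFS gateway documentation recommends, soft is added here so I/O errors out instead of hanging, and a floating VIP in front of the gateways would automate the remount step.

```sh
# Mount through servernfs01 with the options recommended for the HDFS NFS
# gateway, plus 'soft' so requests fail fast if the gateway goes down.
mount -t nfs -o vers=3,proto=tcp,nolock,soft servernfs01:/ /test01

# If servernfs01 fails, detach and remount the same namespace from a
# surviving gateway; the data lives in HDFS, so nothing is lost.
umount -f /test01
mount -t nfs -o vers=3,proto=tcp,nolock,soft servernfs02:/ /test01
```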
02-11-2016 10:52 PM
@PJ Moutrie Please see this deck; as mentioned earlier, the Hive hook is in place.
http://www.slideshare.net/hortonworks/data-governance-atlas-7122015
02-14-2017 06:13 PM
Install the HBase client on that node; after that you will be able to launch the HBase shell.
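A minimal sketch of what that looks like on an RPM-based HDP node (the yum package name is an assumption about the repo layout; adding the "HBase Client" component to the host from Ambari is the managed equivalent and also lays down the configs):

```sh
# Install the HBase client bits (or add the "HBase Client" component
# to this host from Ambari instead).
yum install -y hbase

# With the client and its configuration in place, launch the shell:
hbase shell
```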
05-24-2017 11:58 AM
I have a small article for getting started with Kafka; I have tried to keep it simple and precise. https://www.linkedin.com/pulse/introduction-kafka-using-nodejs-pankaj-panigrahi
02-11-2016 10:37 PM
@vperiasamy 1) I did not clean up the DB manually. 2) Good point. 3) I did use the same DB.
03-04-2016 12:08 AM
Does that mean that if the cluster is Kerberized, we don't need Knox, and that installing Ranger alone is enough?
06-02-2016 02:33 PM
Thanks Neeraj! I also set up LDAP through the FreeIPA service and configured LDAP for Hive in Ambari. The links below helped me a lot with the setup.
https://github.com/hortonworks-gallery/ambari-freeipa-service
https://github.com/abajwa-hw/security-workshops/blob/master/Setup-Ambari.md#authentication-via-ldap
** If you installed LDAP through the FreeIPA link above, you have to set the baseDN to "cn=users,cn=accounts,dc=hortonworks,dc=com" in the Ambari properties. After a successful LDAP configuration, you can verify it with: ldapsearch -h localhost -p 389 -w hortonworks -x -b 'dc=hortonworks,dc=com' uid=ali
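For reference, a hedged sketch of how those values end up in /etc/ambari-server/conf/ambari.properties after running `ambari-server setup-ldap` (the property names are the ones Ambari 2.x writes; the URL and attribute values here just mirror the FreeIPA example above):

```properties
# Written by `ambari-server setup-ldap`; values mirror the FreeIPA setup above.
client.security=ldap
authentication.ldap.primaryUrl=localhost:389
authentication.ldap.useSSL=false
authentication.ldap.baseDn=cn=users,cn=accounts,dc=hortonworks,dc=com
authentication.ldap.usernameAttribute=uid
```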
02-11-2016 04:56 AM
@Sunile Manjee The -d statusdir=<dir> option is used to specify the HDFS directory in which to store the log files (stdout, stderr) for the executed job.
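A minimal sketch of how that option appears in a WebHCat (Templeton) job submission via curl; statusdir is the documented parameter, while the host, user, output path, and query below are illustrative:

```sh
# Submit a Hive query through WebHCat; the job's stdout/stderr and exit
# code are written to the HDFS directory named by statusdir.
curl -s -d execute="select count(*) from sample_07;" \
     -d statusdir="/user/admin/webhcat_out" \
     "http://webhcat-host:50111/templeton/v1/hive?user.name=admin"

# After the job finishes, inspect the captured logs:
hdfs dfs -cat /user/admin/webhcat_out/stdout
hdfs dfs -cat /user/admin/webhcat_out/stderr
```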