12-07-2016 09:58 PM
Ah, it needed an account on the hadoop2 server, since HiveServer2 is running there. I created 'sami' on hadoop2 and added it to the hadoop group, and now I can use Hive with my own ticket.
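For anyone hitting the same issue, a minimal sketch of that fix, run as root on the HiveServer2 node (hadoop2 here; 'sami' and 'hadoop' are the names used in this thread, adjust for your cluster):

```shell
# On the node running HiveServer2: create the local OS account that
# YARN resolves the Kerberos principal to, and add it to the hadoop
# group. Requires root.
useradd sami               # create the local account
usermod -aG hadoop sami    # add it to the hadoop group
getent group hadoop        # verify: 'sami' should appear in the member list
```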
12-07-2016 09:37 PM
Where? On Linux, in /etc/group like below?
hdfs:x:504:hdfs,sami
I tried the above but got the same error.
12-07-2016 09:07 PM
Yes, I see the user 'sami' there; please see the screenshot below.
12-07-2016 01:20 AM
User 'sami' exists as a Unix user as well as in the KDC; that's why I can do "kinit sami".
12-07-2016 01:07 AM
I have given myself full rights on both HDFS and Hive, yet for some reason I can't connect to Hive using my own ticket ('sami'). But if I get a 'hive' ticket instead, then I can get into Hive. Why?

-bash-4.1$ klist
Ticket cache: FILE:/tmp/krb5cc_600
Default principal: sami@TMY.COM
Valid starting Expires Service principal
12/06/16 19:57:32 12/07/16 19:57:32 krbtgt/TMY.COM@TMY.COM
renew until 12/06/16 19:57:32
-bash-4.1$
-bash-4.1$
-bash-4.1$ hive
Logging initialized using configuration in file:/etc/hive/2.5.0.0-1245/0/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. Application application_1481054355280_0003 failed 2 times due to AM Container for appattempt_1481054355280_0003_000002 exited with exitCode: -1000
For more detailed output, check the application tracking page: http://hadoop2.my.com:8088/cluster/app/application_1481054355280_0003 Then click on links to logs of each attempt.
Diagnostics: Application application_1481054355280_0003 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is sami
main : requested yarn user is sami
User sami not found
Failing this attempt. Failing the application.
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:536)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:680)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. Application application_1481054355280_0003 failed 2 times due to AM Container for appattempt_1481054355280_0003_000002 exited with exitCode: -1000
For more detailed output, check the application tracking page: http://hadoop2.my.com:8088/cluster/app/application_1481054355280_0003 Then click on links to logs of each attempt.
Diagnostics: Application application_1481054355280_0003 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is sami
main : requested yarn user is sami
User sami not found
Failing this attempt. Failing the application.
at org.apache.tez.client.TezClient.waitTillReady(TezClient.java:779)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:217)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:117)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:533)
... 8 more
-bash-4.1$
-bash-4.1$
-bash-4.1$ id
uid=600(sami) gid=600(sami) groups=600(sami),501(hadoop)
-bash-4.1$ klist
Ticket cache: FILE:/tmp/krb5cc_600
Default principal: sami@TMY.COM
Valid starting Expires Service principal
12/06/16 19:57:32 12/07/16 19:57:32 krbtgt/TMY.COM@TMY.COM
renew until 12/06/16 19:57:32
-bash-4.1$ kinit hive
Password for hive@TMY.COM:
-bash-4.1$
-bash-4.1$
-bash-4.1$ hive
Logging initialized using configuration in file:/etc/hive/2.5.0.0-1245/0/hive-log4j.properties
hive>
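The "User sami not found" diagnostic in the transcript above comes from YARN's container launcher: on a secured cluster it resolves the Kerberos principal to a local OS account on the node that launches the container, so the account must exist on every NodeManager host, not just the client machine. A small sketch of that check (the function name is mine, not from the thread):

```shell
# Report whether a local OS account exists -- the same lookup the
# secure container executor performs before launching a container.
check_user() {
  if id "$1" >/dev/null 2>&1; then
    echo "user $1 exists"
  else
    echo "user $1 not found"
  fi
}

# In this thread, 'sami' existed on the client host but not on
# hadoop2, where the AM container was scheduled.
check_user sami
```

Running this on each NodeManager host (for example over ssh) shows immediately which nodes are missing the account.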
12-06-2016 10:07 PM
This used to work, but not anymore.

[hive@hadoop1 ~]$ kdestroy
[hive@hadoop1 ~]$
[hive@hadoop1 ~]$ id
uid=1004(hive) gid=501(hadoop) groups=501(hadoop)
[hive@hadoop1 ~]$ klist
klist: No credentials cache found (ticket cache FILE:/tmp/krb5cc_1004)
[hive@hadoop1 ~]$ kinit hive
Password for hive@MY.COM:
[hive@hadoop1 ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_1004
Default principal: hive@MY.COM
Valid starting Expires Service principal
12/06/16 17:04:14 12/07/16 17:04:14 krbtgt/MY.COM@MY.COM
renew until 12/06/16 17:04:14
[hive@hadoop1 ~]$
[hive@hadoop1 ~]$ beeline -u 'jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM' -f b.sql
Connecting to jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM
Connected to: Apache Hive (version 1.2.1000.2.5.0.0-1245)
Driver: Hive JDBC (version 1.2.1000.2.5.0.0-1245)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://hadoop2:10000/default> show tables;
Error: Error while compiling statement: FAILED: SemanticException MetaException(message:org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=EXECUTE, inode="/apps/hive/warehouse":hdfs:hdfs:d---------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:307)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)
) (state=42000,code=40000)
Closing: 0: jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM
[hive@hadoop1 ~]$
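The inode shown in the error above, /apps/hive/warehouse with owner hdfs:hdfs and mode d---------, has every permission bit stripped, so even the hive user fails the EXECUTE (traverse) check; the RangerHdfsAuthorizer frame in the stack means Ranger evaluated first and then fell back to these HDFS permissions. One way to inspect and restore it as the hdfs superuser; the hive:hadoop ownership and 770 mode below are the common HDP defaults, assumed rather than taken from this cluster, so check your own security policy first:

```shell
# Inspect the warehouse directory's current owner and mode.
hdfs dfs -ls -d /apps/hive/warehouse

# Restore the usual HDP ownership and a traversable mode as the
# hdfs superuser (defaults assumed here, not from this thread):
sudo -u hdfs hdfs dfs -chown hive:hadoop /apps/hive/warehouse
sudo -u hdfs hdfs dfs -chmod 770 /apps/hive/warehouse
```

Alternatively, a Ranger HDFS policy granting the hive user read/write/execute on /apps/hive/warehouse would allow access without changing the POSIX bits.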
12-06-2016 09:29 PM
The xa_portal.log file is getting filled with these records, even though the system is not being used. I am using HDP 2.5.
2016-12-06 16:24:30,044 [http-bio-6080-exec-1] INFO apache.ranger.security.web.filter.RangerKRBAuthenticationFilter (RangerKRBAuthenticationFilter.java:220) - Logged into Ranger as = hdfs
2016-12-06 16:24:30,045 [http-bio-6080-exec-1] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:232) - UserSession Updated to set new Permissions to User: hdfs
2016-12-06 16:24:30,046 [http-bio-6080-exec-1] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:184) - Login Success: loginId=hdfs, sessionId=null, sessionId=1EC316895EBF3A497B9227FED1143A70, requestId=10.100.44.17, epoch=1481059470046
2016-12-06 16:24:44,133 [http-bio-6080-exec-5] INFO apache.ranger.security.web.filter.RangerKRBAuthenticationFilter (RangerKRBAuthenticationFilter.java:220) - Logged into Ranger as = hive
2016-12-06 16:24:44,135 [http-bio-6080-exec-5] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:232) - UserSession Updated to set new Permissions to User: hive
2016-12-06 16:24:44,135 [http-bio-6080-exec-5] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:184) - Login Success: loginId=hive, sessionId=null, sessionId=FFB6C01FD48142647893352C6AFF3CDB, requestId=10.100.44.16, epoch=1481059484135
2016-12-06 16:25:00,055 [http-bio-6080-exec-3] INFO apache.ranger.security.web.filter.RangerKRBAuthenticationFilter (RangerKRBAuthenticationFilter.java:220) - Logged into Ranger as = hdfs
2016-12-06 16:25:00,056 [http-bio-6080-exec-3] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:232) - UserSession Updated to set new Permissions to User: hdfs
2016-12-06 16:25:00,057 [http-bio-6080-exec-3] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:184) - Login Success: loginId=hdfs, sessionId=null, sessionId=C6017523D930A116A3735C92F5397A49, requestId=10.100.44.17, epoch=1481059500057
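The entries above look like the Ranger plugins (hdfs and hive) polling Ranger Admin for policy updates; each poll authenticates and logs a new session, which matches the roughly 30-second cadence of the timestamps. If the log noise itself is the concern, the relevant loggers can be raised to WARN in Ranger Admin's log4j configuration; a sketch, with the logger names taken from the log lines above and the file path being the usual HDP location (confirm on your install):

```
# /usr/hdp/current/ranger-admin/ews/webapp/WEB-INF/log4j.properties
# Quiet the per-poll session/login INFO messages:
log4j.logger.org.apache.ranger.biz.SessionMgr=WARN
log4j.logger.org.apache.ranger.security.web.filter.RangerKRBAuthenticationFilter=WARN
```

Ranger Admin needs a restart for the change to take effect.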
12-05-2016 11:56 PM
I just checked, and Ambari Infra is already installed in my HDP 2.5 cluster.
12-05-2016 11:28 PM
Can you guide me on how to install Ambari Infra (Solr)? I didn't find it under Add Services.