Member since: 01-05-2017
Posts: 13
Kudos Received: 0
Solutions: 0
04-16-2017
05:41 PM
@Olga Svyryd Hi, did you ever resolve this issue? I am facing the same problem. I hope you can help.
01-09-2017
04:43 AM
@Jay SenSharma Yes, SASL is enabled (set to true) in hive-site.xml, but it is still showing the same error.
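For reference, this is roughly how I am checking the values (the grep below assumes the standard /etc/hive/conf layout; the expected values in the comments are only what a kerberized setup normally carries, not confirmed from my cluster):

# Quick check of the relevant properties (path assumes the default HDP layout)
grep -A1 -E 'hive.server2.authentication|hive.metastore.sasl.enabled' /etc/hive/conf/hive-site.xml

# On a kerberized cluster these are normally:
#   hive.server2.authentication                    = KERBEROS
#   hive.metastore.sasl.enabled                    = true
#   hive.server2.authentication.kerberos.principal = hive/_HOST@HADOOP.COM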
01-08-2017
04:47 AM
@Jay SenSharma Yes, SASL is enabled (set to true) in hive-site.xml.
01-07-2017
05:17 AM
@Jay SenSharma
01-06-2017
11:28 AM
@Jay SenSharma Thanks for the help 🙂 The File view is working fine now, but now I am having a problem with the Hive view. Here is the log, and I have also attached the Hive view config settings:

05 Jan 2017 22:16:26,599 ERROR [qtp-ambari-client-618] ServiceFormattedException:95 - H020 Could not establish connecton to hdpdn2.hadoop.com:10000: org.apache.thrift.transport.TTransportException: Peer indicated failure: Unsupported mechanism type GSSAPI: org.apache.thrift.transport.TTransportException: Peer indicated failure: Unsupported mechanism type GSSAPI
05 Jan 2017 22:16:26,600 ERROR [qtp-ambari-client-618] ServiceFormattedException:96 - org.apache.ambari.view.hive.client.HiveClientException: H020 Could not establish connecton to hdpdn2.hadoop.com:10000: org.apache.thrift.transport.TTransportException: Peer indicated failure: Unsupported mechanism type GSSAPI: org.apache.thrift.transport.TTransportException: Peer indicated failure: Unsupported mechanism type GSSAPI

The port itself is listening fine:

[root@hdpdn2 keytabs]# netstat -tulpn | grep 10000
tcp 0 0 0.0.0.0:10000 0.0.0.0:* LISTEN 14800/java
[root@hdpdn2 keytabs]# ps -ef | grep 14800
hive 14800 1 0 15:24 ? 00:00:35 /usr/jdk64/jdk1.8.0_60/bin/java -Xmx1024m -Dhdp.version=2.3.4.0-3485 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.3.4.0-3485 -Dhadoop.log.dir=/var/log/hadoop/hive -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.3.4.0-3485/hadoop -Dhadoop.id.str=hive -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.3.4.0-3485/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1024m -Xmx1203m -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.3.4.0-3485/hive/lib/hive-service-1.2.1.2.3.4.0-3485.jar org.apache.hive.service.server.HiveServer2 --hiveconf hive.aux.jars.path=file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar -hiveconf hive.metastore.uris= -hiveconf hive.log.file=hiveserver2.log -hiveconf hive.log.dir=/var/log/hive

The Hive view settings:
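To narrow it down, I also plan to test HiveServer2 directly with beeline, bypassing the view (the keytab path and principal below are assumptions based on a default HDP Kerberos setup, not confirmed values from this cluster). If this connects, HiveServer2 itself is fine and the problem is in the view's connection settings; if it fails with the same GSSAPI error, HiveServer2 is not actually running with Kerberos authentication:

# Assumed default HDP service keytab and principal; adjust for the actual cluster.
kinit -kt /etc/security/keytabs/hive.service.keytab hive/hdpdn2.hadoop.com@HADOOP.COM

# Standard beeline JDBC URL for a kerberized HiveServer2 on port 10000.
beeline -u "jdbc:hive2://hdpdn2.hadoop.com:10000/default;principal=hive/_HOST@HADOOP.COM"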
01-06-2017
06:26 AM
@Kuldeep Kulkarni I have made the changes, but it still doesn't work. I have posted the log file and the view settings above.
01-06-2017
06:26 AM
@rguruvannagari I have made the changes, but it still doesn't work. I have posted the log file and the view settings above.
01-05-2017
05:15 PM
Ambari version: 2.2.0.0
01-05-2017
05:12 PM
@Jay SenSharma Here are my File view settings and the log file with the error: fileview.txt
01-05-2017
12:27 PM
Hi experts, I have a 3-node kerberized Hadoop cluster. When I open a view, I get the error below:

SIMPLE authentication is not enabled. Available:[TOKEN]

I came across a suggested solution in this thread: https://community.hortonworks.com/questions/7896/simple-authentication-is-not-enabled-availabletoke.html

Following it, I made two changes:
1. Set "hadoop.security.authentication" to "simple" in /etc/hadoop/conf/core-site.xml via Ambari.
2. Set "hive.server2.authentication" to "None".

Now I am able to open the views, but the NodeManagers and DataNodes fail with Kerberos errors, as below:

2017-01-05 16:12:22,661 WARN authorize.ServiceAuthorizationManager (ServiceAuthorizationManager.java:authorize(119)) - Authorization failed for yarn (auth:SIMPLE) for protocol=interface org.apache.hadoop.yarn.server.api.ResourceTrackerPB, expected client Kerberos principal is nm/hdpdn1.hadoop.com@HADOOP.COM
2017-01-05 16:12:22,661 INFO ipc.Server (Server.java:authorizeConnection(2039)) - Connection from 192.168.56.41:34702 for protocol org.apache.hadoop.yarn.server.api.ResourceTrackerPB is unauthorized for user yarn (auth:SIMPLE)
2017-01-05 16:12:22,661 INFO ipc.Server (Server.java:doRead(850)) - Socket Reader #1 for port 8025: readAndProcess from client 192.168.56.41 threw exception [org.apache.hadoop.security.authorize.AuthorizationException: User yarn (auth:SIMPLE) is not authorized for protocol interface org.apache.hadoop.yarn.server.api.ResourceTrackerPB, expected client Kerberos principal is nm/hdpdn1.hadoop.com@HADOOP.COM]

Please help me get both the kerberized cluster and the views working.
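For completeness, this is the state the change leaves the cluster in (hdfs getconf is a standard HDFS client command; the value noted in the comment is what I expect given the change above, included only as an illustration):

# Show the effective authentication mode on any node.
hdfs getconf -confKey hadoop.security.authentication
# Prints "simple" after change 1 above, while the NodeManagers/DataNodes still
# expect Kerberos principals, which matches the authorization failures in the log.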