Member since 09-28-2015
73 Posts
26 Kudos Received
6 Solutions
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 7732 | 01-20-2017 01:27 PM |
| | 3270 | 06-01-2016 08:24 AM |
| | 3535 | 05-28-2016 01:33 AM |
| | 2278 | 05-17-2016 03:44 PM |
| | 1299 | 12-22-2015 01:50 AM |
01-20-2017 12:00 PM
After enabling Kerberos using the Ambari wizard, the Kafka scripts no longer work. Are there any additional configurations needed to make them work? I am using HDP 2.5.3.

$ kinit
$ ./kafka-topics.sh --zookeeper localhost:2181 --create --topic foo --partitions 1 --replication-factor 1
[2017-01-20 11:54:59,482] WARN Could not login: the client is being asked for a password, but the Zookeeper client code does not currently support obtaining a password from the user. Make sure that the client is configured to use a ticket cache (using the JAAS configuration setting 'useTicketCache=true)' and restart the client. If you still get this message after that, the TGT in the ticket cache has expired and must be manually refreshed. To do so, first determine if you are using a password or a keytab. If the former, run kinit in a Unix shell in the environment of the user who is running this Zookeeper client using the command 'kinit <princ>' (where <princ> is the name of the client's Kerberos principal). If the latter, do 'kinit -k -t <keytab> <princ>' (where <princ> is the name of the Kerberos principal, and <keytab> is the location of the keytab file). After manually refreshing your cache, restart this client. If you continue to see this message after manually refreshing your cache, ensure that your KDC host's clock is in sync with this host's clock. (org.apache.zookeeper.client.ZooKeeperSaslClient)
[2017-01-20 11:54:59,484] WARN SASL configuration failed: javax.security.auth.login.LoginException: No password provided Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it. (org.apache.zookeeper.ClientCnxn)
Exception in thread "main" org.I0Itec.zkclient.exception.ZkAuthFailedException: Authentication failure
at org.I0Itec.zkclient.ZkClient.waitForKeeperState(ZkClient.java:946)
at org.I0Itec.zkclient.ZkClient.waitUntilConnected(ZkClient.java:923)
at org.I0Itec.zkclient.ZkClient.connect(ZkClient.java:1230)
at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:156)
at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:130)
at kafka.utils.ZkUtils$.createZkClientAndConnection(ZkUtils.scala:75)
at kafka.utils.ZkUtils$.apply(ZkUtils.scala:57)
at kafka.admin.TopicCommand$.main(TopicCommand.scala:54)
at kafka.admin.TopicCommand.main(TopicCommand.scala)
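For reference, the warning itself points at a ticket-cache JAAS configuration; this is the shape of the client JAAS file I believe it means. A minimal sketch, assuming a hypothetical file /tmp/kafka_client_jaas.conf and a TGT already obtained via kinit (not verified against HDP 2.5.3's stock kafka-env):

Client {
   com.sun.security.auth.module.Krb5LoginModule required
   useTicketCache=true;
};
KafkaClient {
   com.sun.security.auth.module.Krb5LoginModule required
   useTicketCache=true
   serviceName="kafka";
};

The ZooKeeper client reads the Client section, and the Kafka scripts can pick the file up through KAFKA_OPTS before rerunning the command:

$ export KAFKA_OPTS="-Djava.security.auth.login.config=/tmp/kafka_client_jaas.conf"
$ ./kafka-topics.sh --zookeeper localhost:2181 --create --topic foo --partitions 1 --replication-factor 1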
Labels:
- Apache Kafka
12-16-2016 09:22 AM
1 Kudo
Solved this by setting hadoop.proxyuser.root.hosts=*. For some reason, the HDFS request to create the directory was sent from a host where neither the Ambari Server nor HiveServer2 is running. I am not sure why, but changing this setting solved the issue.
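For anyone applying the same fix: in Ambari this maps to the following core-site.xml property (a sketch of the setting named above; note the wildcard trusts every host, so scope it to specific hosts if that matters in your environment):

<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>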
12-16-2016 06:30 AM
I do not know how to fix it. I also checked the NameNode log; no errors occurred.
12-16-2016 06:27 AM
I have the same issue on HDP 2.5 & Ambari 2.4.0.1. I have created all the necessary HDFS directories and granted the proper permissions, but a simple 'show tables' query just doesn't work. Digging into the HDFS logs, I found that the Ambari Hive View did not create the staging directory under /user/admin/hive/jobs. It should create the hive-job-6-2016-12-16_06-15 directory before trying to write the hql file.

$ tail -f hdfs-audit.log | grep '/user/admin'
2016-12-16 06:15:55,156 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.0.178 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-6-2016-12-16_06-15 dst=null perm=null proto=webhdfs

This error happens after I enabled the Ranger plugin for Hive. I also have another working Ambari Hive View on HDC; it creates the staging directories and the hql file properly.

$ tail -f hdfs-audit.log | grep '/user/admin'
2016-12-16 06:17:29,003 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin dst=null perm=null proto=webhdfs
2016-12-16 06:17:31,148 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/.AUTO_HIVE_INSTANCE.defaultSettings dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,474 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17 dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,486 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.119 cmd=create src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=admin:hdfs:rw-r--r-- proto=rpc
2016-12-16 06:17:35,509 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.120 cmd=create src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=admin:hdfs:rw-r--r-- proto=rpc
2016-12-16 06:17:35,522 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,523 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=open src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,527 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.119 cmd=open src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=null proto=rpc
2016-12-16 06:17:35,582 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,583 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=open src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,587 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.119 cmd=open src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=null proto=rpc
2016-12-16 06:17:35,590 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,593 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,765 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,769 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.119 cmd=open src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=rpc
2016-12-16 06:17:35,771 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,774 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,803 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,807 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.120 cmd=open src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=rpc
2016-12-16 06:17:35,810 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
2016-12-16 06:17:35,812 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
2016-12-16 06:17:45,915 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
2016-12-16 06:17:45,919 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.120 cmd=open src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=rpc
2016-12-16 06:17:45,921 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
2016-12-16 06:17:45,923 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207 cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null perm=null proto=webhdfs
12-09-2016 12:15 AM
Hi, does HDC support adding admin users? The use case is to allow individual admins to use their own credentials to create and delete clusters.
11-29-2016 02:59 PM
Thanks, @Ankit Singhal. This solved the issue.
11-28-2016 02:36 PM
Both are shipped with HDP 2.5, so they should be the same.
11-27-2016 02:30 PM
Hi, I could not connect to the Phoenix Query Server in HDP 2.5 from my Java program using the thin JDBC client. Any advice? My connection string is jdbc:phoenix:thin:url=http://localhost:8765, and I am using the thin driver org.apache.phoenix.queryserver.client.Driver.

// driver and connectionString hold the values given above
Connection conn = null;
try {
    Class.forName(driver);
    conn = DriverManager.getConnection(connectionString);
} catch (Exception e) {
    logger.error("Failed connecting to database.", e);
}

I got the error below:

java.lang.RuntimeException: org.apache.phoenix.shaded.com.fasterxml.jackson.core.JsonParseException: Unexpected character ('o' (code 111)): Expected space separating root-level values
at [Source:
8org.apache.calcite.avatica.proto.Responses$ErrorResponse�
�org.apache.calcite.avatica.com.google.protobuf.InvalidProtocolBufferException: While parsing a protocol message, the input ended unexpectedly in the middle of a field. This could mean either that the input has been truncated or that an embedded message misreported its own length.
at org.apache.calcite.avatica.com.google.protobuf.InvalidProtocolBufferException.truncatedMessage(InvalidProtocolBufferException.java:70)
at org.apache.calcite.avatica.com.google.protobuf.CodedInputStream.skipRawBytesSlowPath(CodedInputStream.java:1293)
at org.apache.calcite.avatica.com.google.protobuf.CodedInputStream.skipRawBytes(CodedInputStream.java:1276)
at org.apache.calcite.avatica.com.google.protobuf.CodedInputStream.skipField(CodedInputStream.java:197)
at org.apache.calcite.avatica.com.google.protobuf.CodedInputStream.skipMessage(CodedInputStream.java:273)
at org.apache.calcite.avatica.com.google.protobuf.CodedInputStream.skipField(CodedInputStream.java:200)
at org.apache.calcite.avatica.proto.Common$WireMessage.<init>(Common.java:11627)
at org.apache.calcite.avatica.proto.Common$WireMessage.<init>(Common.java:11595)
at org.apache.calcite.avatica.proto.Common$WireMessage$1.parsePartialFrom(Common.java:12061)
at org.apache.calcite.avatica.proto.Common$WireMessage$1.parsePartialFrom(Common.java:12055)
at org.apache.calcite.avatica.com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:89)
at org.apache.calcite.avatica.com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:95)
at org.apache.calcite.avatica.com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
at org.apache.calcite.avatica.proto.Common$WireMessage.parseFrom(Common.java:11791)
at org.apache.calcite.avatica.remote.ProtobufTranslationImpl.parseRequest(ProtobufTranslationImpl.java:354)
at org.apache.calcite.avatica.remote.ProtobufHandler.decode(ProtobufHandler.java:51)
at org.apache.calcite.avatica.remote.ProtobufHandler.decode(ProtobufHandler.java:31)
at org.apache.calcite.avatica.remote.AbstractHandler.apply(AbstractHandler.java:94)
at org.apache.calcite.avatica.remote.ProtobufHandler.apply(ProtobufHandler.java:46)
at org.apache.calcite.avatica.server.AvaticaProtobufHandler.handle(AvaticaProtobufHandler.java:124)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
at org.apache.phoenix.shaded.org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.apache.phoenix.shaded.org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
at org.apache.phoenix.shaded.org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.apache.phoenix.shaded.org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
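For what it's worth, a JsonParseException over what is visibly a protobuf payload (Responses$ErrorResponse) suggests a serialization mismatch between the thin client and the Query Server. A minimal sketch of the usual remedy, assuming the server is using Avatica's protobuf serialization (the HDP 2.5 default for PQS):

// Tell the thin client to speak protobuf instead of the JSON default.
String url = "jdbc:phoenix:thin:url=http://localhost:8765;serialization=PROTOBUF";
Class.forName("org.apache.phoenix.queryserver.client.Driver");
Connection conn = DriverManager.getConnection(url);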
Labels:
- Apache Phoenix
10-31-2016 06:02 AM
Does the ThriftServer in HDP support sharing RDDs today?