Member since: 09-10-2016
Posts: 82
Kudos Received: 6
Solutions: 9
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 6446 | 08-28-2019 11:07 AM |
| | 5953 | 12-21-2018 05:59 PM |
| | 3089 | 12-10-2018 05:16 PM |
| | 2580 | 12-10-2018 01:03 PM |
| | 1690 | 12-07-2018 08:04 AM |
11-30-2019
05:02 AM
@ManuelCalvo I changed WARN to DEBUG and ran the Kafka producer. Please find the details below:
jaas.conf:
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/home/user.keytab"
storeKey=true
useTicketCache=false
debug=true
serviceName="kafka"
principal="user@domain.COM";
};
Client {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/home/<user>/user.keytab"
storeKey=true
useTicketCache=false
debug=true
serviceName="zookeeper"
principal="user@domain.COM";
};
client.properties:
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
kafka-producer:
[<user>@server ~]$ export KAFKA_OPTS="-Djava.security.auth.login.config=/home/<user>/jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf -Dsun.security.krb5.debug=true"
[<user>@server ~]$ $KAFKA_HOME/bin/kafka-console-producer.sh --broker-list $BROKER_LIST --producer.config /home/<user>/client.properties --topic testtopic
full error log:
[<user>@server ~]$ $KAFKA_HOME/bin/kafka-console-producer.sh --broker-list $BROKER_LIST --producer.config /home/<user>/client.properties --topic testtopic
[2019-11-30 12:52:57,917] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2019-11-30 12:52:57,977] INFO ProducerConfig values:
acks = 1
batch.size = 16384
bootstrap.servers = [server1:9092, server2:9092, server3:9092, server4:9092]
buffer.memory = 33554432
client.id = console-producer
compression.type = none
connections.max.idle.ms = 540000
enable.idempotence = false
interceptor.classes = []
key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
linger.ms = 1000
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 1500
retries = 3
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = kafka
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = SASL_PLAINTEXT
send.buffer.bytes = 102400
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
(org.apache.kafka.clients.producer.ProducerConfig)
[2019-11-30 12:52:57,997] DEBUG Added sensor with name bufferpool-wait-time (org.apache.kafka.common.metrics.Metrics)
[2019-11-30 12:52:58,000] DEBUG Added sensor with name buffer-exhausted-records (org.apache.kafka.common.metrics.Metrics)
[2019-11-30 12:52:58,191] DEBUG Updated cluster metadata version 1 to Cluster(id = null, nodes = [server1:9092 (id: -2 rack: null), server2:9092 (id: -1 rack: null), server3:9092 (id: -4 rack: null), server4:9092 (id: -3 rack: null)], partitions = [], controller = null) (org.apache.kafka.clients.Metadata)
[2019-11-30 12:52:58,210] INFO [Producer clientId=console-producer] Closing the Kafka producer with timeoutMillis = 0 ms. (org.apache.kafka.clients.producer.KafkaProducer)
[2019-11-30 12:52:58,211] DEBUG [Producer clientId=console-producer] Kafka producer has been closed (org.apache.kafka.clients.producer.KafkaProducer)
org.apache.kafka.common.KafkaException: Failed to construct kafka producer
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:457)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:304)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:45)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:153)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:140)
at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:65)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:88)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:414)
... 3 more
Caused by: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:940)
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:60)
at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:103)
at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:65)
at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:125)
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:142)
... 7 more
[<user>@server ~]$
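Since the JAAS file here is supplied only through KAFKA_OPTS, it may also be worth checking how the launcher scripts themselves handle that variable. A hedged sketch (paths follow the $KAFKA_HOME used above; the grep patterns are only illustrative):
# see where the wrapper scripts consume KAFKA_OPTS / KAFKA_CLIENT_KERBEROS_PARAMS and
# whether either of them injects its own -Djava.security.auth.login.config
grep -n -e "KAFKA_OPTS" -e "KAFKA_CLIENT_KERBEROS_PARAMS" -e "java.security.auth.login.config" \
  $KAFKA_HOME/bin/kafka-console-producer.sh $KAFKA_HOME/bin/kafka-run-class.sh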
11-25-2019
11:44 AM
Hi @ManuelCalvo, Yes, the keytab has the right permissions to get a valid ticket; I tried obtaining the ticket manually and it works fine. What I observed here is that the KAFKA_OPTS environment variable is ignored by the Kafka clients. The console producer/consumer should work with KAFKA_OPTS, which is expected to take priority over the system properties, but I exported KAFKA_OPTS pointing to the JAAS file and the Kerberos client configuration file and it is not working.
Kafka version: 2.0.0.3
export KAFKA_OPTS="-Djava.security.auth.login.config=/home/<user>/jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf"
error:
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:153)
If I pass the SASL parameters in client.properties as below, I'm able to produce/consume data from topics without any issue.
$KAFKA_HOME/bin/kafka-console-producer.sh --broker-list $BROKER_LIST --topic testtopic --producer.config /home/<user>/client.properties
$ cat client.properties
security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
#sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
useKeyTab=true \
storeKey=true \
keyTab="/home/<user>/<user>.keytab" \
useTicketCache=false \
serviceName="kafka" \
principal="user@domain.COM"; Any idea why export KAFKA_OPTS is not working here? Thank you
11-23-2019
02:57 AM
I'm getting the below error message while trying to produce data to a Kafka topic in a Kerberized HDP cluster.
Error:
DEBUG [Producer clientId=console-producer] Kafka producer has been closed (org.apache.kafka.clients.producer.KafkaProducer)
org.apache.kafka.common.KafkaException: Failed to construct kafka producer
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:457)
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
Stack:
HDP 3.1.0
Kafka 1.0.0.3.1
$KAFKA_HOME="/usr/hdp/3.1.0.0-78/kafka"
$BROKER_LIST="<broker-list>"
$ZK_HOSTS="<zk-host-list>:2181/kafka"
$export KAFKA_OPTS="-Djava.security.auth.login.config=/home/<user>/jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf -Djavax.security.auth.useSubjectCredsOnly=true -Dsun.security.krb5.debug=true"
$export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/home/<user>/jaas.conf -Dsun.security.krb5.debug=true"
$cat jaas.conf
---using user keytab & principal for authentication and disabled useTicketCache---
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/home/<user>/user.keytab"
storeKey=true
useTicketCache=false
serviceName="kafka"
principal="user@domain.COM";
};
Client {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/home/<user>/user.keytab"
storeKey=true
useTicketCache=false
serviceName="zookeeper"
principal="user@domain.COM";
};
$cat client.properties
security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
$klist
klist: No credentials cache found (filename: /tmp/krb5cc_121852)
$kafka-console-producer.sh
$KAFKA_HOME/bin/kafka-console-producer.sh --broker-list <broker-list>:9092 --topic testtopic --producer.config /home/<user>/client.properties
full error log:
[2019-11-23 10:05:45,614] DEBUG Added sensor with name bufferpool-wait-time (org.apache.kafka.common.metrics.Metrics)
[2019-11-23 10:05:45,617] DEBUG Added sensor with name buffer-exhausted-records (org.apache.kafka.common.metrics.Metrics)
[2019-11-23 10:05:45,620] DEBUG Updated cluster metadata version 1 to Cluster(id = null, nodes = [sl975iaehdp0401.visa.com:9092 (id: -1 rack: null)], partitions = [], controller = null) (org.apache.kafka.clients.Metadata)
[2019-11-23 10:05:45,637] INFO [Producer clientId=console-producer] Closing the Kafka producer with timeoutMillis = 0 ms. (org.apache.kafka.clients.producer.KafkaProducer)
[2019-11-23 10:05:45,638] DEBUG [Producer clientId=console-producer] Kafka producer has been closed (org.apache.kafka.clients.producer.KafkaProducer)
org.apache.kafka.common.KafkaException: Failed to construct kafka producer
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:457)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:304)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:45)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:153)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:140)
at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:65)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:88)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:414)
... 3 more
Caused by: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:940)
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:60)
at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:103)
at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:65)
at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:125)
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:142)
... 7 more
Could you please help with this?
Thank you.
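As a sanity check that is independent of the Kafka client, the keytab and principal referenced in the jaas.conf above can be verified on their own (a hedged sketch; substitute the real user and realm):
# list the principals stored in the keytab
klist -kt /home/<user>/user.keytab
# confirm the keytab can obtain a TGT for the principal used in jaas.conf
kinit -kt /home/<user>/user.keytab user@domain.COM && klist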
11-08-2019
05:58 AM
Can you please assist with this? Thanks.
11-06-2019
02:58 AM
Hi,
We are getting the below error while executing a Hive query with Spark as the execution engine.
Hive version: 1.2.1, Spark version: 1.6
set hive.execution.engine=spark;
set spark.home=/usr/hdp/current/spark-client;
set hive.execution.engine=spark;
set spark.master=yarn-client;
set spark.eventLog.enabled=true;
set spark.executor.memory=512m;
set spark.executor.cores=2;
set spark.driver.extraClassPath=/usr/hdp/current/hive-client/lib/hive-exec.jar;
Query ID = svchdpir2d_20191106105445_a9ebc8a2-9c28-4a3d-ac5e-0a8609e56fd5
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
Starting Spark Job = c6cc1641-20ad-4073-ab62-4f621ae595c8
Status: SENT
Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html#release for an explanation.
Could you please help with this?
Thank you
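In case it helps while waiting for input: with Hive on Spark, "RPC channel is closed" at status SENT usually means the remote Spark driver died or never connected back, so the HiveServer2/Hive log and the YARN application logs are the usual next places to look. A hedged sketch (the name filter and <application_id> are placeholders):
# find the YARN application the failed query launched (Hive-on-Spark apps are typically named "Hive on Spark")
yarn application -list -appStates ALL | grep -i "Hive on Spark"
# pull its container logs to see why the driver/executors went away
yarn logs -applicationId <application_id>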
08-28-2019
11:07 AM
Hi @jsensharma, After changing SPARK_HISTORY_OPTS in Advanced spark2-env as below, the Spark2 History Server UI started working. Do you think the config change below is the fix for this issue? Please advise. Thanks.
From:
export SPARK_HISTORY_OPTS='-Dspark.ui.filters=org.apache.hadoop.security.authentication.server.AuthenticationFilter -Dspark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.params="type=kerberos,kerberos.principal={{spnego_principal}},kerberos.keytab={{spnego_keytab}}"'
To:
export SPARK_HISTORY_OPTS='-Dspark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.params="type=kerberos,kerberos.principal={{spnego_principal}},kerberos.keytab={{spnego_keytab}}"'
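If you want to double-check that the change took effect, one hedged way after restarting the History Server is to look at its running command line; the removed spark.ui.filters flag should no longer appear there (the grep patterns are only illustrative):
# each -D option in SPARK_HISTORY_OPTS ends up on the java command line
ps -ef | grep [H]istoryServer | tr ' ' '\n' | grep -e "spark.ui.filters" -e "AuthenticationFilter"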
08-28-2019
06:50 AM
Hi @jsensharma, This is the first time we are deploying Spark2 in our cluster. Thanks
08-28-2019
06:19 AM
Hi @jsensharma, Please find the Java version below:
lrwxrwxrwx. 1 root root 28 Jul 31 16:20 latest -> /usr/java/jdk1.8.0_221-amd64
$ ps -ef | grep spark | grep -v grep
/usr/java/jdk1.8.0_221-amd64/bin/java -Dhdp.version=3.1.0.0-78 -cp /usr/hdp/current/spark2-historyserver/conf/:/usr/hdp/current/spark2-historyserver/jars/*:/usr/hdp/3.1.0.0-78/hadoop/conf/ -Dspark.ui.filters=org.apache.hadoop.security.authentication.server.AuthenticationFilter -Dspark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.params=type=kerberos,kerberos.principal=HTTP/xxxx.xxxx.com@CORPxx.xxx.COM,kerberos.keytab=/etc/security/keytabs/spnego.service.keytab -Xmx2048m org.apache.spark.deploy.history.HistoryServer
$ java -version
java version "1.8.0_221"
Java(TM) SE Runtime Environment (build 1.8.0_221-b27)
Java HotSpot(TM) 64-Bit Server VM (build 25.221-b27, mixed mode)
Thanks
08-28-2019
06:07 AM
We have installed Spark2 in HDP 3.1, but when we are trying to access the Spark2 History Server UI, we are getting the below issue.
HTTP ERROR 403
Problem accessing /. Reason:
java.lang.IllegalArgumentException
log: spark-spark-org.apache.spark.deploy.history.HistoryServer-1-xxxxxx.visa.com.out
19/08/28 13:01:21 DEBUG AuthenticationFilter: Request [http://xxxxxxxx:18081/favicon.ico] triggering authentication. handler: class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler
19/08/28 13:01:21 DEBUG AuthenticationFilter: Authentication exception: java.lang.IllegalArgumentException
org.apache.hadoop.security.authentication.client.AuthenticationException: java.lang.IllegalArgumentException
at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:306)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:536)
at org.spark_project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:448)
at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.spark_project.jetty.server.Server.handle(Server.java:539)
at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:333)
at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:108)
at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException
at java.nio.Buffer.limit(Buffer.java:275)
at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:365)
at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:358)
at org.apache.hadoop.security.authentication.util.KerberosUtil.getTokenServerName(KerberosUtil.java:291)
at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:285)
... 22 more
HDP 3.1
Spark 2.3
Kerberized cluster
Could you please help with this? Thank you.
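One way to see the SPNEGO exchange behind this 403 outside the browser is a curl negotiate test from a host with a valid Kerberos ticket (a hedged sketch; the host placeholder and the 18081 port come from the log above, and curl must be built with GSS-API support):
# obtain a ticket first, e.g. kinit <user>@<REALM>, then:
curl --negotiate -u : -v "http://<history-server-host>:18081/" 2>&1 | grep -i -e "HTTP/1.1" -e "WWW-Authenticate"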
07-15-2019
04:13 PM
Hi, The below properties work in the Hive shell but not in Beeline, where we get: Error: Error while processing statement: Cannot modify mapred.compress.map.output at runtime. It is not in list of params that are allowed to be modified at runtime
SET mapred.output.compress
SET mapred.compress.map.output
How do we whitelist these properties for Beeline? Thank you.
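For what it's worth, the mechanism usually used for this is HiveServer2's whitelist-append setting, since Beeline sessions go through HiveServer2's whitelist check while the Hive CLI typically does not. A hedged sketch (the value is a Java regex, so the exact escaping may need checking against your Hive version, and HiveServer2 needs a restart):
# added to hive-site.xml (or Custom hiveserver2-site in Ambari), then restart HiveServer2
hive.security.authorization.sqlstd.confwhitelist.append=mapred\.output\.compress|mapred\.compress\.map\.output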
Labels:
- Apache Hive