08-07-2018
01:20 AM
@dbains Hi, here is more info.

kafka server.properties file content:
listeners=SASL_PLAINTEXT://host:port
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN

flume.log file:
..........
Date, 310 ERROR [lifecycleSupervisor-1-2]
(org.apache.flume.lifecycle.LifecycleSupervisor$MonitorRunnable.run:251) - Unable to start PollableSourceRunner : {source:org.apache.flume.source.kafka.KafkaSource{name:source1, state:IDLE} counterGroup:{name:null counters:{}}} - Exception follows.
java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.subscribe(Ljava/util/List;Lorg/apache/kafka/clients/consumer/ConsumerRebalanceListener;)
..........

Despite the error reporting "java.lang.NoSuchMethodError: ...KafkaConsumer.subscribe(...)", I looked at the kafka-clients jar in Flume's lib folder and confirmed that a subscribe(...) method is in there. So the Flume agent is still unable to function properly with a Kafka broker secured with SASL/PLAIN.
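A plausible explanation, assuming Flume 1.8's Kafka source was compiled against the 0.9 consumer API: in kafka-clients 0.10.0 the first parameter of subscribe(...) was widened from List to Collection. That change is source-compatible but binary-incompatible, so code compiled against 0.9 throws NoSuchMethodError at runtime even though a method named subscribe(...) is visible in the newer jar. The signatures, paraphrased from the Kafka javadocs:

```java
// kafka-clients 0.9.x -- the signature Flume's KafkaSource would have been compiled against:
public void subscribe(List<String> topics, ConsumerRebalanceListener listener);

// kafka-clients 0.10.x -- List widened to Collection (binary-incompatible):
public void subscribe(Collection<String> topics, ConsumerRebalanceListener listener);
```

That would explain why simply dropping kafka-clients-0.10.2.0.jar into Flume's lib folder fails even though the method "exists".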
08-04-2018
02:18 AM
@dbains Hi, I appreciate your quick response dbains. Let me make my use case clearer:

(1) Apache kafka_2.11-0.10.2.0 (which is not a component of HDP-2.6.5) is running on one Linux server, and Apache Flume 1.8.0 is running on another, so the Kafka broker and the Flume agent are on separate Linux servers/machines/boxes. The needed Kafka/ZooKeeper and Flume ports are open for communication between the servers.

(2) The Kafka security protocol is SASL_PLAINTEXT and the SASL mechanism is PLAIN. There is NO Kerberos security set up/enabled on the Kafka broker.

Taking (1) and (2) into account, why are you confused by "Flume and Kafka are running on different RHEL servers and Kafka secured with the security.protocol=SASL_PLAINTEXT"?

(3) The Flume 1.8 User Guide https://flume.apache.org/FlumeUserGuide.html says that in my case I should use a1.sources.s1.kafka.consumer.security.protocol=SASL_PLAINTEXT and a1.sources.s1.kafka.consumer.sasl.mechanism=PLAIN. Why should I use a1.sources.s1.kafka.consumer.security.protocol=PLAINTEXT instead?

(4) With the above Kafka and Flume setup I'm experiencing the following issues:

(4-a) When I start the Flume agent I get the error "ERROR ..... could Not find PlainLoginModule". Browsing various Flume discussions, I concluded that kafka-clients-0.9.0.1.jar does not contain the PlainLoginModule.

(4-b) When I replaced the original kafka-clients-0.9.0.1.jar with kafka-clients-0.10.2.0.jar, I got the error "ERROR ... Unable to start PollableSourceRunner:...{name:s1, state:IDLE} counterGroup ...", and this is the issue I'm currently unable to resolve.

Once again, I appreciate your help dbains, and I will try to run the Flume agent with a1.sources.s1.kafka.consumer.security.protocol=PLAINTEXT as you advised. I'll also double-check the "listeners" property on the Kafka broker and let you know what security protocol it is listening on.

I would appreciate any help resolving this issue, or a pointer to any helpful information on how to properly run an Apache kafka_2.11-0.10.2.0 broker with Flume 1.8 with the Kafka security protocol SASL_PLAINTEXT and SASL mechanism PLAIN (NO Kerberos enabled). Thanks in advance.
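For reference, the consumer-side settings described above, collected into one place as a minimal sketch (agent name a1 and source name s1 are the names used in this thread; broker-host:9092, my_topic and flume-consumer are placeholders):

```properties
# flume_conf.properties -- Kafka source with SASL/PLAIN, sketch only
a1.sources.s1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.s1.kafka.bootstrap.servers = broker-host:9092
a1.sources.s1.kafka.topics = my_topic
a1.sources.s1.kafka.consumer.group.id = flume-consumer
# Everything after "kafka.consumer." is passed through to the Kafka consumer:
a1.sources.s1.kafka.consumer.security.protocol = SASL_PLAINTEXT
a1.sources.s1.kafka.consumer.sasl.mechanism = PLAIN
```

The broker's listeners setting must advertise a SASL_PLAINTEXT endpoint for this to connect.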
08-03-2018
12:46 AM
Environment: apache-flume-1.8.0, kafka_2.11-0.10.2.0. Flume and Kafka are running on different RHEL servers, and Kafka is secured with security.protocol=SASL_PLAINTEXT and sasl.mechanism=PLAIN.

flume_conf.properties contains:
a1.sources.s1.kafka.consumer.security.protocol=SASL_PLAINTEXT
a1.sources.s1.kafka.consumer.sasl.mechanism=PLAIN

flume_plain_jaas.conf contains:
KafkaClient { org.apache.kafka.common.security.plain.PlainLoginModule required username="user_name" password="user_pwd"; };

flume-env.sh contains:
JAVA_OPTS="$JAVA_OPTS -Djava.security.auth.login.config=/path_to_jaas/flume_plain_jaas.conf"

The Flume agent does not start and produces the error: ERROR ... Unexpected error performing start ... *common.KafkaException: Failed to construct kafka consumer ... Caused by: org.apache.kafka.common.KafkaException: java.lang.IllegalArgumentException: No serviceName defined in either JAAS or Kafka config ...

It seems to me the serviceName is relevant to SASL_PLAINTEXT with Kerberos on Kafka, but this Kafka broker/instance does not use Kerberos. Any help resolving this issue will be appreciated. Thanks
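A likely cause, assuming Flume 1.8 still has the kafka-clients-0.9.x jar on its classpath: SASL/PLAIN support was only added to the Kafka clients in 0.10.0, so a 0.9 client supports only GSSAPI (Kerberos) and therefore asks for a Kerberos serviceName even when the JAAS file names PlainLoginModule. For reference, a JAAS file of the shape described above would look like this sketch (credentials are placeholders; note the two terminating semicolons, which JAAS syntax requires):

```properties
# flume_plain_jaas.conf -- sketch only
KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="user_name"
  password="user_pwd";
};
```

If the 0.9 client is the culprit, fixing the JAAS file alone will not remove the serviceName error.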
Labels:
- Apache Flume
08-02-2018
01:33 AM
I'm running Flume 1.8 with spoolDir as the Flume source and HDFS as the sink. I need to process log files in the spoolDir whose lines contain more than 10,000 characters, and guarantee that the file in HDFS contains exactly the same number of lines. According to the Flume 1.8 User Guide, the DEFAULT maximum number of characters to include in a single event is 2048: if a line exceeds this length, it is truncated, and the remaining characters on the line will appear in a subsequent event. I need to set the value of deserializer.maxLineLength as large as possible to prevent truncation of the input file's lines. Is there any limit on the value of deserializer.maxLineLength in Flume 1.8? Thanks
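As far as I can tell, deserializer.maxLineLength is read as a Java int, so the hard ceiling would be Integer.MAX_VALUE (2147483647); the practical limit is JVM heap, since each event buffers the whole line in memory. A minimal sketch of raising the cap, with illustrative names (a1/s1) and a placeholder path:

```properties
a1.sources.s1.type = spooldir
a1.sources.s1.spoolDir = /path/to/spool
a1.sources.s1.deserializer = LINE
# Set the cap comfortably above the longest expected line (here 1 MiB),
# rather than pushing it to the absolute maximum:
a1.sources.s1.deserializer.maxLineLength = 1048576
```

Keeping every input line under this cap avoids the truncate-and-continue behavior, which is what would break the one-to-one line mapping into HDFS.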
Labels:
- Apache Flume