Created 01-02-2017 06:05 AM
Hi - I'm trying the Kafka file import and export (Kafka Connect), but it's failing with a timeout:
ERROR Failed to flush WorkerSourceTask{id=local-file-source-0}, timed out while waiting for producer to flush outstanding messages, 1 left ({ProducerRecord(topic=newtest, partition=null, key=[B@63d673d3, value=[B@144e54da=ProducerRecord(topic=newtest, partition=null, key=[B@63d673d3, value=[B@144e54da}) (org.apache.kafka.connect.runtime.WorkerSourceTask:239)
[2017-01-02 05:51:08,891] ERROR Failed to commit offsets for WorkerSourceTask{id=local-file-source-0} (org.apache.kafka.connect.runtime.SourceTaskOffsetCommitter:112)
I checked both the Kafka server and ZooKeeper, and they are running fine.
I see no other errors in the logs.
Please help me fix this issue.
Thanks, Sathish
Created 01-02-2017 12:06 PM
You need to add the below properties in /etc/kafka/conf/connect-standalone.properties if you are using SASL:
producer.security.protocol=SASL_PLAINTEXT
producer.sasl.kerberos.service.name=kafka
consumer.security.protocol=SASL_PLAINTEXT
consumer.sasl.kerberos.service.name=kafka
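With those properties in place, the standalone worker has to be restarted so it picks up the new producer/consumer settings. A sketch, assuming an HDP-style install layout (the paths are assumptions; adjust them for your install):

```shell
# Restart the standalone Connect worker so the SASL overrides take effect.
# Paths below are assumptions for an HDP-style layout, not confirmed by the thread.
cd /usr/hdp/current/kafka-broker
bin/connect-standalone.sh /etc/kafka/conf/connect-standalone.properties \
    /etc/kafka/conf/connect-file-source.properties
```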
Created 01-02-2017 10:49 AM
I've set up port 6667 with the security protocol PLAINTEXTSASL, but Kafka Connect by default runs with the producer property security.protocol = PLAINTEXT. How can I override these parameters for Kafka Connect? I've updated them in standalone.properties, but Kafka Connect is not picking them up at startup. How should I change the producer properties for Kafka Connect?
Thanks,
Sathish
Created 01-02-2017 09:58 AM
Right now I see all the below parameters in connect-log4j.properties:
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n
log4j.logger.org.apache.zookeeper=ERROR
log4j.logger.org.I0Itec.zkclient=ERROR
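To get more detail from the Connect runtime itself while debugging, a DEBUG logger can be added to that file. A sketch; the logger name simply follows the standard org.apache.kafka.connect package:

```
# Raise Connect runtime logging to DEBUG (revert to INFO after debugging)
log4j.logger.org.apache.kafka.connect=DEBUG
```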
Thanks,
Sathish
Created 01-02-2017 10:02 AM
Yes, I've started it in debug mode. Please give me some time; I'm going through the logs now.
Created 01-03-2017 04:35 AM
Yes, it was a problem with the security protocol. I've changed the properties and it is working now. Is there any link or doc for parameter reference?
Thanks,
Sathish
Created 01-03-2017 06:57 AM
Right now I'm testing with an RDBMS source (MySQL), and Kafka Connect is failing on "connector.class". How can I find the correct connector class for an RDBMS (MySQL database)?
I've tried org.apache.kafka.connect.jdbc.JdbcSourceConnector and io.confluent.connect.jdbc.JdbcSourceConnector, and neither exists.
Thanks,
Sathish
Created 01-03-2017 08:38 AM
I'm not sure about a MySQL connector, but the class you are using belongs to https://github.com/confluentinc/kafka-connect-jdbc, which is not shipped with HDP.
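For what it's worth, if that kafka-connect-jdbc plugin were installed separately, its source connector is configured with a properties file along these lines. A sketch only: the connector is not shipped with HDP, and the connection URL, database name, credentials, and topic prefix below are made-up examples:

```
# Hypothetical JDBC source connector config (requires installing
# Confluent's kafka-connect-jdbc plugin; not shipped with HDP).
name=mysql-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:mysql://localhost:3306/testdb?user=kafka&password=secret
mode=incrementing
incrementing.column.name=id
topic.prefix=mysql-
```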
Created 01-03-2017 08:48 AM
It's not specific to MySQL; I mean the connector class for an "RDBMS" source in general.
Thanks,
Sathish
Created 01-03-2017 08:56 AM
I should have been more specific: I'm looking for a Kafka connector class for an RDBMS as a source. Can you please let me know the connector class, or point me to any doc for reference?
Thanks,
Sathish
Created 01-03-2017 02:33 PM
I am not sure about the connector; I'd request that you ask a new question for that.