
Kafka Connect file import and export

Hi, I'm trying Kafka Connect file import and export, but it's failing with a timeout:

ERROR Failed to flush WorkerSourceTask{id=local-file-source-0}, timed out while waiting for producer to flush outstanding messages, 1 left ({ProducerRecord(topic=newtest, partition=null, key=[B@63d673d3, value=[B@144e54da=ProducerRecord(topic=newtest, partition=null, key=[B@63d673d3, value=[B@144e54da}) (org.apache.kafka.connect.runtime.WorkerSourceTask:239)

[2017-01-02 05:51:08,891] ERROR Failed to commit offsets for WorkerSourceTask{id=local-file-source-0} (org.apache.kafka.connect.runtime.SourceTaskOffsetCommitter:112)

I checked both the Kafka server and ZooKeeper, and both are running fine.

I'm not seeing any other errors in the logs.

Please help me fix this issue.

thanks,

sathish

ACCEPTED SOLUTION

@sathish jeganathan

You need to add the following properties in /etc/kafka/conf/connect-standalone.properties if you are using SASL:

producer.security.protocol=SASL_PLAINTEXT
producer.sasl.kerberos.service.name=kafka
consumer.security.protocol=SASL_PLAINTEXT
consumer.sasl.kerberos.service.name=kafka
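
With those in place, restart the standalone worker so it rereads the file; for example (a sketch, assuming the usual HDP paths and the stock file-source example config):

/usr/hdp/current/kafka-broker/bin/connect-standalone.sh /etc/kafka/conf/connect-standalone.properties /etc/kafka/conf/connect-file-source.properties

The producer./consumer. prefixes are what tell the Connect worker to pass these settings through to the embedded producer and consumer clients that the source and sink tasks use.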


19 REPLIES

@Sandeep Nemuri

I've set up port 6667 with the security protocol set to plaintextsasl, but by default Kafka Connect runs with its own producer properties (security.protocol = PLAINTEXT). How can I override these parameters for Kafka Connect? I've updated them in standalone.properties, but Kafka Connect is not picking them up at startup. How should I change the producer properties for Kafka Connect?

thanks,

sathish

Right now I see the parameters below in connect-log4j.properties:

log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender

log4j.appender.stdout.layout=org.apache.log4j.PatternLayout

log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n

log4j.logger.org.apache.zookeeper=ERROR

log4j.logger.org.I0Itec.zkclient=ERROR

thanks,

sathish

Yes, I've started it in debug mode. Please give me some time; I'm going through the logs now.
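
For anyone hitting this later: one way to get that extra detail is to raise the Connect logger level in connect-log4j.properties (a sketch; the logger name is simply the Connect runtime package):

log4j.logger.org.apache.kafka.connect=DEBUG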

Yes, it was a problem with the security protocol. I've changed the properties and it's working now. Is there a link or doc for parameter reference?

thanks,

sathish

@Sandeep Nemuri

Right now I'm testing with an RDBMS source (MySQL), and Kafka Connect is failing on "connector.class". How can I find the correct connector class for an RDBMS (MySQL database)?

I've tried org.apache.kafka.connect.jdbc.JdbcSourceConnector and io.confulent.connect.jdbc.JdbcSourceConnector, and neither exists.

thanks,

sathish

@sathish jeganathan

I'm not sure about the MySQL connector, but the class you are using belongs to https://github.com/confluentinc/kafka-connect-jdbc, which is not shipped with HDP.
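
If you do install that plugin, a source connector config would look roughly like this (a sketch based on that project's docs; the connection URL, credentials, column name, and topic prefix are placeholders):

name=mysql-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:mysql://localhost:3306/testdb?user=myuser&password=mypass
mode=incrementing
incrementing.column.name=id
topic.prefix=mysql-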

It's not specific to MySQL; I'm asking about the connector class for an "RDBMS" source.

thanks,

sathish

I should have been more specific: I'm looking for a Kafka connector class for an RDBMS as the source. Can you please let me know the connector class, or point me to any doc for reference?

thanks,

sathish

@sathish jeganathan

I am not sure about that connector; I'd request you to ask a new question for it.