Member since
08-21-2017
13
Posts
0
Kudos Received
0
Solutions
09-19-2017
01:28 AM
Hi @Matt Clarke, your explanation was useful for me in building my NiFi flow, but I am experiencing a loss of 7 records. I have posted about the same issue in the forum; here is the link: https://community.hortonworks.com/questions/138873/data-loss-found-with-tcp-and-mergecontent-processo.html Can you help me figure out the mistake I am making in the configuration of the processors? Currently, I am using PutFile instead of PutHDFS so that I can easily check the line count of the merged content. Sravanthi
09-05-2017
06:37 PM
Hi everyone, I have an issue with Kafka. On the cluster, the ZooKeeper client and ZooKeeper server are installed and running. Alongside them, the Kafka server is up and running, and Ambari shows its health as good. But when I execute the producer command for Kafka as below, I get the following errors:

Kafka producer command:
bin/kafka-console-producer.sh --broker-list kafka.broker:6667 --topic testTopic

Error messages:
[2017-09-05 12:22:02,638] ERROR Error when sending message to topic testTopic with key: null, value: 8 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Expiring 2 record(s) for testTopic-0: 1535 ms has passed since batch creation plus linger time
[2017-09-05 12:22:05,439] ERROR Error when sending message to topic testTopic with key: null, value: 5 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for testTopic-0: 1536 ms has passed since batch creation plus linger time

I cannot figure out the actual issue here. Any help will be appreciated. Thanks, Sravanthi
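(Editor's note, not from the original post.) This TimeoutException usually means the client could not get a send acknowledged before the producer's internal deadline, most often because the advertised broker listener is unreachable from the client. A hedged sketch of the producer settings involved, using standard Kafka producer config-key names with purely illustrative values:

```python
# Illustrative Kafka producer settings relevant to the TimeoutException above.
# The keys are standard Kafka producer configs; the values are examples only,
# not recommendations for this cluster.
producer_settings = {
    "bootstrap.servers": "kafka.broker:6667",  # must match the broker's advertised listener
    "request.timeout.ms": 30000,               # how long to wait for the broker to respond
    "linger.ms": 0,                            # extra time a batch may wait before sending
}

# Batches that sit unsent longer than roughly this budget are expired
# (the "batch creation plus linger time" wording in the log):
budget_ms = producer_settings["request.timeout.ms"] + producer_settings["linger.ms"]
print(budget_ms)
```

If the records expire long before that budget, the broker address or port in --broker-list is the first thing worth double-checking.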
Labels:
- Apache Ambari
- Apache Kafka
08-29-2017
05:23 AM
Thanks a lot, Bryan. I added the newline using Shift+Enter in the value field, and I succeeded in getting the desired batches of records. Earlier, I had tried using '\n' and '\\n' for this attribute. Out of curiosity, shouldn't '\\n' be equivalent to Shift+Enter? Please correct me if I am wrong here. Thanks, Sravanthi
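(Editor's note, not from the original post.) The confusion between '\n' and '\\n' can be illustrated with plain string semantics. The sketch below uses Python escaping as an analogy only; NiFi's property value field takes typed characters literally, which is why Shift+Enter (a real newline keystroke) works where a typed backslash-n does not:

```python
# A typed backslash followed by 'n' is two literal characters,
# not a newline. Shift+Enter inserts the actual newline character.
two_chars = "\\n"     # backslash + letter n (what typing \n into a literal field gives)
real_newline = "\n"   # a single newline character

print(len(two_chars))               # length 2
print(len(real_newline))            # length 1
print(two_chars == real_newline)    # the two are not the same string
```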
08-27-2017
04:22 AM
Hi, we are using the NiFi processors below to build a sample pipeline data flow with Kafka:

GetFile --> RouteOnAttribute --> PublishKafka_0_10

Workflow steps:
- We read a CSV file (50 MB) using the 'GetFile' processor, providing the folder location of the file. Each record is about 1 KB, so the file holds roughly 50K records. This works.
- We then connected a 'RouteOnAttribute' processor to pick only this file from the GetFile processor. This works.
- We then connected a 'PublishKafka_0_10' processor, provided a topic, and started the configured server with its properties. This works too.

The issue arises when we try to publish data to the Kafka topic using the PublishKafka_0_10 1.2.0 processor of NiFi 1.2.0 in HDF 3.0.1.0-43:
- I consume with my Spark-Kafka consumer (a custom Spark job running on the cluster), configured with a maximum batch fetch size of 30 MB and a buffer size of 15 MB.
- But while running the Spark job, each consumed batch contains only 10 records.

NOTE: I also tried a custom Kafka producer (sample Kafka producer code) that produces 50K messages (1 KB/record) in a loop. With the same Spark consumer, I received much larger batches (up to 15K records/batch). So the issue seems to be with the PublishKafka processor, which sends only a few records at a time to the topic.

Is there any way to tune the parameters to achieve maximum write throughput to a topic with this processor? Please find attached the configurations I used for PublishKafka and the Spark-Kafka consumer (custom Spark code). Thanks in advance, Sravanthi
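(Editor's note, not from the original post.) Kafka producer throughput is generally governed by the batch.size and linger.ms producer configs, and the PublishKafka processors can, as far as I know, pass such producer properties through as dynamic properties. A minimal sketch of the kind of overrides meant here; the keys are standard Kafka producer configs, but the values are hypothetical and for illustration only:

```python
# Hypothetical producer overrides that commonly affect batching throughput.
# Standard Kafka producer config keys; values are illustrative, not tuned.
producer_overrides = {
    "bootstrap.servers": "kafka.broker:6667",
    "batch.size": 1048576,   # bytes per batch (Kafka's default is 16384)
    "linger.ms": 50,         # wait up to 50 ms to fill a batch (default 0)
}

# Rough upper bound on records per batch at ~1 KB per record:
records_per_batch = producer_overrides["batch.size"] // 1024
print(records_per_batch)
```

With the default 16 KB batch.size and ~1 KB records, batches of around 10-16 records would be expected, which matches the symptom described above, so raising batch.size and linger.ms is the usual first experiment.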
Labels:
- Apache Kafka
- Apache NiFi