Member since
10-06-2017
12
Posts
0
Kudos Received
0
Solutions
01-27-2021
01:54 AM
@sow I am having the same issue. Did you ever find a resolution for it?
11-23-2017
07:49 AM
@Matt Clarke Thanks for your reply. I have solved the problem; I was doing it wrong. I had placed the input port inside a process group, which is why no input ports were being created for the source RPG. Once I placed the input port on the main UI page (the root canvas), the remote input port for the RPG appeared on the source. Thanks!
Now I have a couple more questions. If we want to transmit data from multiple sources, how do we distinguish in the target which flow came from which source? Is there a way to create multiple input ports in the target and route each source's data to a different flow, or will all the data from the various sources arrive at a single input port? If it is the latter, how do we tell the flows apart and filter them into separate flows? Also, is there any way to create the target side of the job inside a process group? Thank you again. 🙂
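One common pattern for a single shared input port, if memory serves: NiFi's site-to-site protocol tags FlowFiles received at an input port with attributes such as s2s.host identifying the sending instance, so a RouteOnAttribute processor placed right after the port can fan the stream out by origin. A sketch of the dynamic properties (the relationship names and hostnames below are placeholders; verify the exact attribute name on a received FlowFile in your NiFi version):

```
# RouteOnAttribute, placed after the shared input port
# (dynamic properties; hostnames are hypothetical)
from-source-a : ${s2s.host:equals('nifi-a.example.com')}
from-source-b : ${s2s.host:equals('nifi-b.example.com')}
```

Each relationship can then feed its own downstream flow, with "unmatched" catching anything unexpected.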
11-22-2017
03:40 PM
Hi,
I am trying to send data from one NiFi instance to another (running on different nodes) using a Remote Process Group. I created a Remote Process Group and entered the URL of the other NiFi host. Now, whenever I try to connect a GetFile or GenerateFlowFile processor to this RPG, I get an error saying the RPG has no input ports. I configured the nifi.properties file as described at https://nifi.apache.org/minifi/getting-started.html and set the site-to-site port number, but it is still not working.
I am attaching some screenshots of my problem and hoping for a solution: nifi1.png nifi2.png nifi3.png
Thanks in advance.
Pratik
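For reference, the site-to-site settings in nifi.properties on the receiving instance look roughly like this (property names as in NiFi 1.x; the port value is just an example — pick any free port, and restart NiFi after changing it):

```
# nifi.properties on the target (receiving) instance
nifi.remote.input.host=
nifi.remote.input.secure=false
nifi.remote.input.socket.port=10000
nifi.remote.input.http.enabled=true
```

The input port that the RPG connects to must also live on the root canvas of the target, not inside a process group (at least in NiFi versions of this era).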
Labels:
- Apache NiFi
11-22-2017
09:07 AM
Yes, it is running on localhost:9092; I have already checked and tested that.
11-17-2017
11:10 AM
I am not getting any error while the job is running; it is just that ConsumeKafka_0_10 is not receiving any flow. When I stopped the processor, I got this warning:
06:06:11 EST WARNING b50f49fb-015f-1000-40da-29fda7c03e89
ConsumeKafka_0_10[id=b50f49fb-015f-1000-40da-29fda7c03e89] Was interrupted while trying to communicate with Kafka with lease org.apache.nifi.processors.kafka.pubsub.ConsumerPool$SimpleConsumerLease@30f9983f. Will roll back session and discard any partially received data.
I then checked the log file, and these are the records I got:
ConsumeKafka_0_10[id=b50f49fb-015f-1000-40da-29fda7c03e89] to run with 1 threads
2017-11-17 06:05:47,352 INFO [Timer-Driven Process Thread-2] o.a.k.clients.consumer.ConsumerConfig ConsumerConfig values:
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [localhost:9092]
check.crcs = true
client.id =
connections.max.idle.ms = 540000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = none
heartbeat.interval.ms = 3000
interceptor.classes = null
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 10000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.ms = 50
request.timeout.ms = 305000
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
2017-11-17 06:05:47,354 INFO [Timer-Driven Process Thread-2] o.a.kafka.common.utils.AppInfoParser Kafka version : 0.10.2.1
2017-11-17 06:05:47,354 INFO [Timer-Driven Process Thread-2] o.a.kafka.common.utils.AppInfoParser Kafka commitId : e89bffd6b2eff799
2017-11-17 06:05:48,832 INFO [Flow Service Tasks Thread-2] o.a.nifi.controller.StandardFlowService Saved flow controller org.apache.nifi.controller.FlowController@65b5b971 // Another save pending = false
2017-11-17 06:06:04,416 INFO [pool-10-thread-1] o.a.n.c.r.WriteAheadFlowFileRepository Initiating checkpoint of FlowFile Repository
2017-11-17 06:06:05,670 INFO [pool-10-thread-1] org.wali.MinimalLockingWriteAheadLog org.wali.MinimalLockingWriteAheadLog@4c1c9116 checkpointed with 111267 Records and 0 Swap Files in 1253 milliseconds (Stop-the-world time = 101 milliseconds, Clear Edit Logs time = 12 millis), max Transaction ID 172038610
2017-11-17 06:06:05,670 INFO [pool-10-thread-1] o.a.n.c.r.WriteAheadFlowFileRepository Successfully checkpointed FlowFile Repository with 111267 records in 1253 milliseconds
2017-11-17 06:06:06,879 INFO [NiFi Web Server-4688] o.a.n.controller.StandardProcessorNode Stopping processor: class org.apache.nifi.processors.kafka.pubsub.ConsumeKafka_0_10
2017-11-17 06:06:06,879 INFO [StandardProcessScheduler Thread-1] o.a.n.c.s.TimerDrivenSchedulingAgent Stopped scheduling ConsumeKafka_0_10[id=b50f49fb-015f-1000-40da-29fda7c03e89] to run
2017-11-17 06:06:08,651 INFO [Flow Service Tasks Thread-2] o.a.nifi.controller.StandardFlowService Saved flow controller org.apache.nifi.controller.FlowController@65b5b971 // Another save pending = false
2017-11-17 06:06:11,936 WARN [Timer-Driven Process Thread-2] o.a.n.p.kafka.pubsub.ConsumeKafka_0_10 ConsumeKafka_0_10[id=b50f49fb-015f-1000-40da-29fda7c03e89] Was interrupted while trying to communicate with Kafka with lease org.apache.nifi.processors.kafka.pubsub.ConsumerPool$SimpleConsumerLease@30f9983f. Will roll back session and discard any partially received data.
11-14-2017
07:25 AM
Hi, I am using a Python script and a Groovy script to do the same task, and I was comparing the performance of the two. I noticed that the Python script runs slowly while the Groovy script runs really fast. In the scripts below I am just replacing spaces with "|".

Python Script:

from org.apache.commons.io import IOUtils
from java.nio.charset import StandardCharsets
from org.apache.nifi.processor.io import StreamCallback

class PyStreamCallback(StreamCallback):
    def __init__(self):
        pass
    def process(self, inputStream, outputStream):
        text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
        text = text.replace(",", "|", 1).replace(" ", "|", 3).replace("|", " ", 1).replace("] ", "]|")
        outputStream.write(bytearray(text.encode('utf-8')))

flowFile = session.get()
if flowFile is not None:
    try:
        flowFile = session.write(flowFile, PyStreamCallback())
        session.transfer(flowFile, REL_SUCCESS)
    except Exception as e:
        log.error('Something went wrong', e)
        session.transfer(flowFile, REL_FAILURE)

Groovy Script:

import org.apache.nifi.processor.io.StreamCallback
import java.nio.charset.StandardCharsets

def flowFile = session.get()
if (!flowFile) return
flowFile = session.write(flowFile, { inputStream, outputStream ->
    inputStream.eachLine { line ->
        String[] names = line.split("\\[")
        names[0] = (names[0].replace(",", " ")).replace(" ", "|")
        def a = names[0] + "[" + names[1]
        outputStream.write("${a}\n".toString().getBytes(StandardCharsets.UTF_8))
    }
} as StreamCallback)
session.transfer(flowFile, REL_SUCCESS)
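As an aside, the chained-replace logic in the Python script can be sanity-checked outside NiFi in plain CPython. The sample line below is a hypothetical log line I made up to illustrate the transformation, not data from the original post:

```python
def pipe_delimit(text):
    # Same chained replaces as in the ExecuteScript body above.
    return (text.replace(",", "|", 1)     # first comma -> pipe
                .replace(" ", "|", 3)     # first three spaces -> pipes
                .replace("|", " ", 1)     # undo the very first pipe (rejoin date and time)
                .replace("] ", "]|"))     # "] " -> "]|" everywhere

line = "2017-11-14 07:25:01,123 INFO [Timer] message body"
print(pipe_delimit(line))
# -> 2017-11-14 07:25:01|123|INFO|[Timer]|message body
```

On the speed difference: under NiFi's ExecuteScript, the Python engine is Jython, which is generally slower than Groovy on the JVM, so a gap like this is expected rather than a bug in either script.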
Labels:
- Apache NiFi
11-14-2017
07:16 AM
Hi, I am using kafka_2.11-0.10.0.0 for my Kafka use case. I am using the ZooKeeper connection string in the GetKafka processor and the broker list in ConsumeKafka_0_10, but ConsumeKafka_0_10 is not working; it does not connect to Kafka and I don't know why. GetKafka, on the other hand, works fine. akakfskfkasfcapture.png asdasdasdcapture.png
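For comparison, a minimal ConsumeKafka_0_10 configuration looks roughly like the following (property display names as I recall them from NiFi 1.x; the topic and group values are placeholders):

```
Kafka Brokers : localhost:9092
Topic Name(s) : my-topic
Group ID      : my-consumer-group
Offset Reset  : earliest
```

Note that GetKafka speaks the old ZooKeeper-based 0.8 protocol while ConsumeKafka_0_10 uses the new-style broker protocol, so they can behave differently against the same cluster; the earlier log also shows the NiFi client is Kafka 0.10.2.1 while the broker here is 0.10.0.0, which is worth checking for compatibility.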
Labels:
- Apache Kafka
- Apache NiFi
10-23-2017
02:29 PM
Yes, I have connected to Hive using DBCPConnectionPool. But is there any way to replace those JARs with a lower version?
10-23-2017
02:27 PM
It is working using DBCPConnectionPool.
10-23-2017
02:09 PM
Thanks for the help. It is working.