Member since: 12-03-2017

- Posts: 156
- Kudos Received: 26
- Solutions: 11
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 2129 | 11-03-2023 12:17 AM |
|  | 4202 | 12-12-2022 09:16 PM |
|  | 1588 | 07-14-2022 03:25 AM |
|  | 2430 | 07-28-2021 04:42 AM |
|  | 3425 | 06-23-2020 10:08 PM |
09-04-2024 10:18 PM | 1 Kudo

@araujo @bbende @MattWho - do you have any suggestions?
09-04-2024 07:54 AM

Hello @Mais - Were you able to deserialise and consume both the key and the value? In my case I am able to get the deserialised value, but I don't see the key anywhere!
09-04-2024 05:34 AM
Hello Experts,

I have a NiFi Kafka consumer (ConsumeKafka_2_6) where the Kafka message body (value/flow file content) and the message key (kafka.key in the flow file attributes) are both Avro-serialized the Confluent Kafka way.

When we use "ConvertRecord + AvroReader CS + ConfluentSchemaRegistry CS" to convert the message body (value/flow file content), it works fine: the magic byte and schema ID are deserialised to the correct value.

But we are trying the following to deserialise kafka.key the same way as the value: bring the flow file attribute kafka.key into the content (using a ReplaceText processor with ${kafka.key}) and then use "ConvertRecord + AvroReader + ConfluentSchemaRegistry" to deserialize. In this case NiFi resolves the wrong schema ID, 3567, instead of the correct schema ID 3545.

Is this happening because NiFi reads kafka.key (originally a byte array) and pushes it downstream as a flow file attribute, which is a string? Is there any other way I can fix this, or a better approach?

Thanks in advance! @bbende @MattWho @mattw

Mahendra
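The string round trip suspected above would explain the exact ID shift. In the Confluent wire format, the Avro payload is prefixed with a magic byte (0x00) and a 4-byte big-endian schema ID. 3545 is 0x00000DD9; if those raw bytes are decoded into a UTF-8 string attribute and the byte after 0xD9 is not a valid continuation byte, 0xD9 is replaced by U+FFFD (bytes EF BF BD), and a reader that parses the re-encoded bytes then sees 0x00000DEF = 3567. A minimal plain-Java sketch of that effect (the single payload byte here is illustrative, not taken from the real topic):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class ConfluentKeyRoundTrip {

    /** Reads the schema ID from Confluent wire format: 1 magic byte (0x00) + 4-byte big-endian schema ID. */
    static int schemaId(byte[] framed) {
        ByteBuffer buf = ByteBuffer.wrap(framed);
        if (buf.get() != 0x00) {
            throw new IllegalArgumentException("Missing Confluent magic byte");
        }
        return buf.getInt();
    }

    public static void main(String[] args) {
        // Illustrative key as a Confluent Avro serializer would frame it:
        // magic byte, schema ID 3545 (0x00000DD9), then the Avro-encoded key payload (one byte here).
        byte[] rawKey = ByteBuffer.allocate(6)
                .put((byte) 0x00)
                .putInt(3545)
                .put((byte) 0x02)
                .array();
        System.out.println("schema id from raw key bytes     : " + schemaId(rawKey));       // 3545

        // Simulate carrying the raw bytes as a UTF-8 string attribute (kafka.key) and turning
        // them back into content later (e.g. via ReplaceText with ${kafka.key}).
        // 0xD9 followed by a non-continuation byte is malformed UTF-8 and becomes U+FFFD (EF BF BD).
        String asAttribute = new String(rawKey, StandardCharsets.UTF_8);
        byte[] roundTripped = asAttribute.getBytes(StandardCharsets.UTF_8);
        System.out.println("schema id after string round trip: " + schemaId(roundTripped)); // 3567
    }
}
```

If that is indeed the cause, the fix is to avoid the lossy UTF-8 conversion of the key, for example by having ConsumeKafka emit the key in a binary-safe encoding (a hex option exists for the key attribute encoding, if I remember correctly) and decoding it back to raw bytes before ConvertRecord, or by using a record-oriented consume strategy that keeps the key inside the record, where the NiFi version supports it.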
						
					
Labels: Apache NiFi
08-26-2024 05:27 AM | 1 Kudo
Hello Experts,

We have a couple of Event Hub consumers running on NiFi 1.16.3. The output connection is configured with the default back pressure: 10,000 messages and 1 GB. But I see 'ConsumeAzureEventHub' keeps consuming data even after crossing the back pressure threshold.

What is the reason for this behavior, and how can it be fixed? @mattw

Thanks,
Mahendra
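One common explanation (worth confirming against the 1.16.3 behavior and release notes) is that back pressure only stops NiFi from scheduling the source component again once the threshold is crossed; it does not interrupt work already in flight, and a consumer that keeps a long-lived, push-style receiver running in the background can keep handing off events regardless of the queue size. A conceptual plain-Java sketch of that difference (an illustration of the pattern only, not NiFi internals):

```java
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

/** Conceptual demo: a "do not schedule the source again" threshold cannot stop a push-style receiver. */
public class PushVsPullBackPressure {

    private static final int THRESHOLD = 1_000;   // stand-in for the 10,000-object back pressure setting
    private static final ConcurrentLinkedQueue<String> queue = new ConcurrentLinkedQueue<>();

    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService pool = Executors.newScheduledThreadPool(2);

        // Pull-style source: the threshold is checked before every run, so it stops adding
        // new items once the queue is "full".
        pool.scheduleAtFixedRate(() -> {
            if (queue.size() < THRESHOLD) {
                queue.add("pulled");
            }
        }, 0, 1, TimeUnit.MILLISECONDS);

        // Push-style source: started once, then keeps delivering from its own background
        // thread without ever consulting the threshold.
        pool.scheduleAtFixedRate(() -> queue.add("pushed"), 0, 1, TimeUnit.MILLISECONDS);

        Thread.sleep(5_000);
        pool.shutdownNow();
        // The queue ends up well above the threshold because of the push-style source.
        System.out.println("queued: " + queue.size() + " (threshold " + THRESHOLD + ")");
    }
}
```

If that is the cause here, the practical mitigations are usually to cap the receive/batch settings on the consumer itself, or to stop the processor when the downstream backlog grows, rather than relying on the connection's back pressure alone.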
						
					
Labels: Apache NiFi
05-30-2024 11:24 PM | 1 Kudo

What an explanation! Cleared my doubts. Thank you so much @MattWho.
05-30-2024 08:08 AM

Hello Experts,

I see a red highlighted number, "2(1)", on an Apache NiFi processor. Is this related to a background process (processor thread) failing, or something else? I face an issue where this custom processor gets stuck once in a while, and I am trying to understand the cause. The processor just invokes an HTTP POST endpoint to upload a file. Any help/suggestion is appreciated.

Thanks,
Mahendra

Labels: Apache NiFi
05-14-2024 01:13 AM | 1 Kudo

@MattWho - I would appreciate it if you have any comment on this issue. Thanks in advance.
05-11-2024 02:13 AM | 1 Kudo

Hello experts,

I am facing an issue on one of the NiFi servers where we have multiple Event Hub consumer flows. The flow file repository disk is getting full, but the content and provenance repos are not. I have attached a screenshot of the usage of all repos and of the contents of the flowfile repo; the journals folder is occupying a very large amount of data.

nifi.properties (related to the flowfile repo):

- nifi.flowfile.repository.always.sync=false
- nifi.flowfile.repository.checkpoint.interval=2 mins
- nifi.flowfile.repository.directory=/flowfile
- nifi.flowfile.repository.implementation=org.apache.nifi.controller.repository.WriteAheadFlowFileRepository
- nifi.flowfile.repository.partitions=256
- nifi.flowfile.repository.retain.orphaned.flowfiles=true
- nifi.flowfile.repository.wal.implementation=org.apache.nifi.wali.SequentialAccessWriteAheadLog

Can anyone help me understand what the issue is and how to resolve it?

Thanks,
Mahendra

Labels: Apache NiFi
04-22-2024 10:05 AM
Hello Experts,

I am using Apache NiFi 1.25 and have a requirement to consume data from a specific offset of a topic partition. In the NiFi ConsumeKafka processor I can only configure "earliest" or "latest", not a specific offset number. Is there any way to achieve this?

Any info/suggestion would be appreciated.

Thanks,
Mahendra
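For reference, the earliest/latest setting maps to Kafka's auto.offset.reset and only applies when the consumer group has no committed offset; ConsumeKafka itself does not expose a seek-to-offset property as far as I know. If starting at an exact offset is a hard requirement, one option is to do the seek outside the standard processor, for example in a custom or scripted processor, or with the plain kafka-clients consumer API: assign the partition manually and call seek before polling. A minimal sketch (the broker address, topic, partition and offset below are placeholders):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class SeekToOffsetExample {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");           // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");                // no committed offsets needed

        TopicPartition partition = new TopicPartition("my-topic", 0);                // placeholder topic/partition
        long startOffset = 12345L;                                                   // the specific offset to start from

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            // assign() instead of subscribe(): manual partition assignment, no consumer-group rebalance
            consumer.assign(Collections.singletonList(partition));
            consumer.seek(partition, startOffset);

            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<byte[], byte[]> record : records) {
                System.out.printf("offset=%d, value bytes=%d%n",
                        record.offset(),
                        record.value() == null ? 0 : record.value().length);
            }
        }
    }
}
```

Another approach that is often simpler operationally is to reset the consumer group's committed offset with the kafka-consumer-groups.sh tooling (it has offset-reset options) while ConsumeKafka is stopped, then start the processor again so it resumes from the committed position.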
						
					
Labels: Apache Kafka, Apache NiFi
03-26-2024 04:29 AM | 1 Kudo

As you have already integrated with Git using the GitFlowPersistenceProvider, you should have all your latest flows in Git. So you can create a new directory on your NiFi Registry machine, clone the Git branch into that new directory, and point NiFi Registry to the new directory. (Take a backup of the metadata DB (H2/MySQL/Postgres) before doing this.)