Member since 07-19-2018
      
613 Posts | 101 Kudos Received | 117 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5012 | 01-11-2021 05:54 AM |
| | 3407 | 01-11-2021 05:52 AM |
| | 8776 | 01-08-2021 05:23 AM |
| | 8345 | 01-04-2021 04:08 AM |
| | 36550 | 12-18-2020 05:42 AM |

12-14-2020 07:05 AM

@toutou From your HDFS cluster you need hdfs-site.xml (and typically core-site.xml) with the correct configuration for PutHDFS. You may also need to create a user with permissions on the target HDFS location. Please share your PutHDFS processor configuration and the error message to allow community members to respond with the specific feedback required to solve your issue.

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post.

Thanks,
Steven
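As an illustration only (the hostname, port, path, and user below are placeholders, not taken from the original post), PutHDFS's Hadoop Configuration Resources property typically points at copies of core-site.xml and hdfs-site.xml pulled from the cluster, and the core-site.xml must at minimum name the filesystem:

```xml
<!-- Minimal core-site.xml sketch; the value must match your cluster's NameNode. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```

On the HDFS side, the user NiFi runs as needs write permission on the target directory, e.g. `hdfs dfs -mkdir -p /data/nifi && hdfs dfs -chown nifi /data/nifi` (path and user are hypothetical).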

12-04-2020 05:16 AM

@SandeepG01 Ah, no fun with bad filenames. Spaces in filenames are highly discouraged these days. That said, a solution you might try is to backslash-escape the space (\ ), especially in the context of passing the filename in flowfile attributes. If you still need to allow spaces and cannot resolve it upstream (by not using spaces), I might suggest submitting your experience as a bug on the NiFi Jira:

https://issues.apache.org/jira/projects/NIFI/issues

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post.

Thanks,
Steven
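As a rough sketch of the backslash idea outside NiFi (the filename is made up), in plain Python:

```python
import shlex

# A filename with a space, as might arrive in a flowfile attribute.
filename = "daily report.csv"

# Backslash-escape the space before handing the name to anything shell-like:
escaped = filename.replace(" ", "\\ ")
print(escaped)  # daily\ report.csv

# The standard library also offers a general quoting helper for shell contexts:
quoted = shlex.quote(filename)
print(quoted)  # 'daily report.csv'
```

Escaping at the point of use is a workaround; as noted above, not producing spaces upstream is the cleaner fix.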

12-01-2020 08:28 AM

		1 Kudo
The problem is that you need something to store the dynamic schemas in. That is where the Schema Registry comes in: it provides a UI and API to add, update, and delete schemas, which can then be referenced from NiFi.

It looks like AvroSchemaRegistry allows you to do something similar, minus the UI/API. You would need to create your schema in your flow as an attribute and send that to an AvroReader configured against AvroSchemaRegistry. You could use some other data store to hold these schemas, but you would need to pull them out into an attribute of the same name configured in the reader and registry.

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-registry-nar/1.12.1/org.apache.nifi.schemaregistry.services.AvroSchemaRegistry/index.html

The latter method does not give you a way to manage all the schemas, which is why I referenced the Hortonworks Schema Registry, which does include the ability to manage and version actual schemas.
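The attribute-driven lookup can be sketched outside NiFi (the names here are illustrative stand-ins, not actual NiFi APIs): a registry maps schema names to Avro schema text, and a reader resolves the schema named by a flowfile attribute such as `schema.name`.

```python
import json

# Hypothetical stand-in for AvroSchemaRegistry: schema name -> Avro schema text.
registry = {
    "user": json.dumps({
        "type": "record",
        "name": "User",
        "fields": [{"name": "id", "type": "long"}],
    })
}

def resolve_schema(flowfile_attributes):
    """Mimic a record reader resolving its schema from the schema.name attribute."""
    name = flowfile_attributes["schema.name"]
    return json.loads(registry[name])

schema = resolve_schema({"schema.name": "user"})
print(schema["name"])  # User
```

The point of the sketch: the flowfile only carries the schema *name*; the schema body lives in one place that both the reader and writer consult.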

12-01-2020 07:49 AM

You can leverage "Attributes to Send", or if you stop the processor and click +, you can add custom attributes right at the bottom of the processor config. If you are not getting anything out of any response relationship (failure, retry, no retry, etc.), then you definitely have a connectivity issue from NiFi outbound.
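InvokeHTTP's "Attributes to Send" property takes a regular expression matched against attribute names; matching attributes are sent as HTTP headers. A rough illustration of that matching in plain Python (the attribute names and pattern are made up):

```python
import re

attributes = {"api.key": "abc123", "api.version": "v2", "filename": "data.json"}

# Hypothetical "Attributes to Send" value: send only attributes starting with "api."
pattern = re.compile(r"api\..*")
headers = {k: v for k, v in attributes.items() if pattern.fullmatch(k)}
print(sorted(headers))  # ['api.key', 'api.version']
```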

12-01-2020 05:09 AM

@Ksad Excellent work showing us completely what you have, and excellent work testing and confirming your request works in Postman first. That is always one of the first things I do to make sure I have a valid test connection and all the settings needed to reach the API before attempting it with InvokeHTTP.

When you take this route and you cannot get a response, it indicates a networking issue from NiFi to [salesforce domain]. You should test from the command line on the NiFi node to [salesforce domain] using curl, wget, telnet, etc.

Next, if you can confirm connectivity, try adjusting the processor timeouts; some systems need longer than the defaults. For example, I sometimes set them to 50 and 150 by just adding a 0 to the two values (Connection Timeout and Read Timeout). If it did time out, it should throw an error. You can also set the processor log level to DEBUG to expose more verbose output in the NiFi UI. Last but not least, tail the nifi-app.log file while doing all NiFi flow debugging; sometimes more useful information is found there.

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post.

Thanks,
Steven
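The command-line connectivity check can also be scripted; a minimal sketch using only the standard library (the host and port in the comment are placeholders):

```python
import socket

def check_connectivity(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. check_connectivity("login.salesforce.com", 443, timeout=50.0)
```

Run this from the NiFi node itself: a False here means the problem is network reachability, not the InvokeHTTP configuration.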

12-01-2020 04:59 AM

@Vamshi245 Yes, HandleHttpRequest and HandleHttpResponse are used in tandem. Behind the processors is a map cache which holds the connection session between the request and response processors. If the flowfile (JSON) coming out of your custom HandleHttpRequest is delivered to the stock HandleHttpResponse, it will send the JSON back to the original connecting client.

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post.

Thanks,
Steven
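The map-cache pattern can be sketched in plain Python (an illustration of the idea, not NiFi's actual implementation): the request side parks the client connection under a generated context identifier that travels with the flowfile, and the response side uses that identifier to find the original client again.

```python
import uuid

sessions = {}  # context id -> parked client connection

def handle_request(client_connection):
    """Park the connection; the returned id travels with the flowfile as an attribute."""
    context_id = str(uuid.uuid4())
    sessions[context_id] = client_connection
    return context_id

def handle_response(context_id, payload):
    """Look up the original client by context id and deliver the payload to it."""
    client_connection = sessions.pop(context_id)
    return client_connection, payload

cid = handle_request("client-socket-A")  # string stand-in for a real socket
client, body = handle_response(cid, '{"ok": true}')
print(client)  # client-socket-A
```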

11-30-2020 11:57 AM

As suggested above, update your post with your processor and its reader and writer settings. It sounds like you have something misconfigured. If possible, show us a screenshot of your flow too.

11-30-2020 11:51 AM

@Chigoz Your issue with that sandbox cluster is likely too many services trying to run on too small an instance/node. You will need to strategically turn on only the components you need, individually, starting with HDFS first. If you have issues specific to the sandbox, or with certain components starting, you should open a post with those specific errors.

To install Hue, check out my management pack:

https://github.com/steven-matison/dfhz_hue_mpack

A local search for "hue install" topics includes articles referencing the Hue mpack above:

https://community.cloudera.com/t5/forums/searchpage/tab/message?advanced=false&allow_punctuation=false&q=install%20hue

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post.

Thanks,
Steven

11-30-2020 05:02 AM

@naga_satish Yes, what you are looking for is the Schema Registry:

https://docs.cloudera.com/HDPDocuments/HDF3/HDF-3.0.0/bk_schema-registry-user-guide/content/ch_integrating-schema-registry.html

The Schema Registry can be configured in NiFi; the schemas you create there are then available in NiFi record readers and writers.

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic, please comment here or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post.

Thanks,
Steven

11-30-2020 04:59 AM

You will need to define your schema in Avro format and drop that into the readers/writers. Here is an example:

```json
{
  "type": "record",
  "name": "DailyCSV",
  "fields": [
    { "name": "DepartmentName", "type": ["string", "null"] },
    { "name": "AccountName", "type": ["string", "null"] },
    { "name": "AccountOwnerId", "type": ["string", "null"] },
    { "name": "AdditionalInfo", "type": ["null", "string"] }
  ]
}
```