Member since 06-08-2017
1049 Posts
518 Kudos Received
312 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 11189 | 04-15-2020 05:01 PM |
|  | 7087 | 10-15-2019 08:12 PM |
|  | 3087 | 10-12-2019 08:29 PM |
|  | 11390 | 09-21-2019 10:04 AM |
|  | 4290 | 09-19-2019 07:11 AM |
Posted 12-22-2017 07:09 PM
3 Kudos
@Bala S You need to extract the extract_date value from your content and add it as an attribute on the flowfile (e.g. if you have a JSON message, use the EvaluateJsonPath processor; if the content is CSV, use the ExtractText processor to extract the date value and keep it on the flowfile). Once you have extracted the value and the attribute is associated with the flowfile, follow the steps below.

For testing I have an extract_date attribute with the value 2017-12-21 00:17:10.0 associated with the flowfile.

Use an UpdateAttribute processor and add these new properties:

year	${extract_date:toDate("yyyy-MM-dd HH:mm:ss"):format("yyyy")}
month	${extract_date:toDate("yyyy-MM-dd HH:mm:ss"):format("MM")}
day	${extract_date:toDate("yyyy-MM-dd HH:mm:ss"):format("dd")}
hour	${extract_date:toDate("yyyy-MM-dd HH:mm:ss"):format("HH")}

Note that day uses the dd format string, while month uses MM. With these UpdateAttribute properties we dynamically generate the year, month, day, and hour attributes from the extract_date attribute.

Then use a PutHDFS processor with the Directory property set to

/folder/year=${year}/month=${month}/day=${day}/hour=${hour}/$(unknown)

We add the year, month, day, and hour attributes to the flowfile with the UpdateAttribute processor as above, then use those attributes in the Directory property. The PutHDFS processor will create the directories in HDFS if they do not already exist.

If this answer helped to resolve your issue, click the Accept button below to accept it; that helps community users find solutions quickly.
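The attribute steps above can be sketched outside NiFi. This is a minimal illustration in plain Python (not NiFi Expression Language) of how the year/month/day/hour parts come out of the extract_date value; the /folder base path is just the example from the post.

```python
from datetime import datetime

extract_date = "2017-12-21 00:17:10.0"

# toDate("yyyy-MM-dd HH:mm:ss") does not consume the trailing ".0"
# fraction, so strip it here before parsing.
ts = datetime.strptime(extract_date.split(".")[0], "%Y-%m-%d %H:%M:%S")

# Build the same partitioned path PutHDFS would receive via
# /folder/year=${year}/month=${month}/day=${day}/hour=${hour}
directory = f"/folder/year={ts:%Y}/month={ts:%m}/day={ts:%d}/hour={ts:%H}"
print(directory)  # /folder/year=2017/month=12/day=21/hour=00
```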
Posted 12-22-2017 03:05 PM
1 Kudo
@Lukas Müller

1. If your content spans more than one line, you need to set the following properties to true (by default both are false):

Enable DOTALL Mode (default: false; allowable values: true, false) — indicates that the expression '.' should match any character, including a line terminator. Can also be specified via the embedded flag (?s).

Enable Multiline Mode (default: false; allowable values: true, false) — indicates that '^' and '$' should match just after and just before a line terminator or end of sequence, instead of only the beginning or end of the entire input. Can also be specified via the embedded flag (?m).

2. You need to change the property below if the content you extract is larger than 1024 B (1 KB):

Maximum Capture Group Length (default: 1024) — specifies the maximum number of characters a given capture group value can have. Any characters beyond the max will be truncated.

3. To extract the whole content of the flowfile, add a new property by clicking the + sign at the top-right corner:

extract_Attribute	(.*) //matches everything
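The two flags can be seen in action with a quick sketch using Python's re module as a stand-in for the Java regex engine NiFi uses (the sample content string is made up):

```python
import re

content = "line one\nline two"

# Without DOTALL, '.' stops at the line terminator, so (.*)
# only captures up to the end of the first line.
assert re.match(r"(.*)", content).group(1) == "line one"

# With DOTALL (the (?s) embedded flag), '.' also matches '\n',
# so (.*) captures the whole multi-line content.
assert re.match(r"(?s)(.*)", content).group(1) == "line one\nline two"

# With MULTILINE (the (?m) embedded flag), '^' matches after each
# line terminator, not just at the start of the input.
assert re.findall(r"(?m)^line", content) == ["line", "line"]
```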
Posted 12-22-2017 02:01 PM
1 Kudo
@Lukas Müller I think your flow has two JSON responses:

1. before the InvokeHTTP processor (up to the 4th processor in the screenshot)
2. after the InvokeHTTP processor (after the 4th processor in the screenshot)

Flow:

1. ReplaceText // remove the last closing brace
2. ExtractText // extract the content of the flowfile and keep it as an attribute
3. InvokeHTTP
4. ReplaceText // remove the first opening brace { from the InvokeHTTP response
5. ReplaceText // merge the existing content of the flowfile with the extracted attribute (step 2)

Let me know if anything is unclear!
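The ReplaceText steps above can be sketched in plain Python: strip the braces that would collide, then splice the two payloads into one JSON object. The two sample messages here are made-up stand-ins, not the actual responses from the flow.

```python
import json
import re

first_json = '{ "sensor" : "SDS011" }'
second_json = '{ "temp" : 276.69 }'

# Step 1: ReplaceText with Search Value (.*)}\Z and Replacement $1
# removes the last closing brace of the first response.
head = re.sub(r"(?s)(.*)}\s*\Z", r"\1", first_json)

# Step 4: ReplaceText with Search Value ^[{](.*) and Replacement $1
# removes the first opening brace of the second response.
tail = re.sub(r"(?s)^\s*{(.*)", r"\1", second_json)

# Step 5: ReplaceText joins the extracted attribute and the new content
# with a comma, producing one valid JSON object.
merged = head.rstrip() + "," + tail
assert json.loads(merged) == {"sensor": "SDS011", "temp": 276.69}
```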
Posted 12-21-2017 04:21 PM
1 Kudo
@Lukas Müller, use the ReplaceText processor with the configs shown in the last attached screenshot; the output of that processor will give you the merged JSON message.
Posted 12-21-2017 12:45 AM
1 Kudo
@Lukas Müller, in your case we have two JSON responses and we are trying to merge them into one, so we need to prepare the JSON messages before merging them.

1. You need to remove the last closing brace } that follows the sensordatavalues array in the first JSON response, using a ReplaceText processor with Search Value (.*)}\Z, Replacement Value $1, and Replacement Strategy Regex Replace.

Configs:

Then extract the resultant content as an attribute on the flowfile.

2. Then trigger the InvokeHTTP processor. Once you get the result, we have a second JSON response, and we need to merge our extracted output into that response to make one JSON message.

3. For the JSON response you got from the InvokeHTTP processor above, remove the opening brace { with a ReplaceText processor, using Search Value ^[{](.*), Replacement Value $1, and Replacement Strategy Regex Replace.

Configs:

So the first opening brace has been removed from the InvokeHTTP response. Then use

4. a ReplaceText processor to prepare the required JSON message, using Prepend with the attribute we extracted with the ExtractText processor.

Example: the ExtractText attribute value (I'm naming this attribute extracted_Attribute for testing):

{
  "id" : 590929682,
  "sampling_rate" : null,
  "timestamp" : "2017-12-19 16:00:58",
  "location" : {
    "id" : 2191,
    "latitude" : "51.482",
    "longitude" : "7.408",
    "country" : "DE"
  },
  "sensor" : {
    "id" : 4355,
    "pin" : "1",
    "sensor_type" : {
      "id" : 14,
      "name" : "SDS011",
      "manufacturer" : "Nova Fitness"
    }
  },
  "sensordatavalues" : [ {
    "id" : 1292433314,
    "value" : "90.53",
    "value_type" : "P1"
  } ]

i.e. without the closing brace at the end. Then, after the next ReplaceText processor removes the first opening brace, the second response should look like this:

  "coord" : {
    "lon" : 8.92,
    "lat" : 48.85
  },
  "weather" : [ {
    "id" : 300,
    "main" : "Drizzle",
    "description" : "light intensity drizzle",
    "icon" : "09d"
  }, {
    "id" : 701,
    "main" : "Mist",
    "description" : "mist",
    "icon" : "50d"
  } ],
  "base" : "stations",
  "main" : {
    "temp" : 276.69,
    "pressure" : 1033,
    "humidity" : 93,
    "temp_min" : 276.15,
    "temp_max" : 277.15
  },
  "visibility" : 10000,
  "wind" : {
    "speed" : 1
  },
  "clouds" : {
    "all" : 75
  },
  "dt" : 1513765200,
  "sys" : {
    "type" : 1,
    "id" : 4891,
    "message" : 0.01,
    "country" : "DE",
    "sunrise" : 1513754085,
    "sunset" : 1513783780
  },
  "id" : 2812053,
  "name" : "Weissach",
  "cod" : 200}

Then the next ReplaceText processor prepares the merged JSON message. For testing purposes I did it this way, but change the Prepend and Append values as per your JSON responses. This is how we can merge two JSON responses into one valid JSON message using NiFi processors.
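The end result of the brace-stripping merge can be checked with a small sketch: the final message contains the keys of both responses in a single object. The two dicts below are trimmed-down versions of the sample responses, kept short for illustration.

```python
import json

sensor_response = {
    "id": 590929682,
    "timestamp": "2017-12-19 16:00:58",
    "sensordatavalues": [{"value": "90.53", "value_type": "P1"}],
}
weather_response = {
    "name": "Weissach",
    "main": {"temp": 276.69, "humidity": 93},
    "cod": 200,
}

# Stripping the trailing '}' of the first message and the leading '{' of
# the second, then joining with a comma, is equivalent to a dict union.
merged_text = (
    json.dumps(sensor_response)[:-1] + "," + json.dumps(weather_response)[1:]
)
merged = json.loads(merged_text)
assert merged == {**sensor_response, **weather_response}
```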
Posted 12-19-2017 03:27 PM
@Lukas Müller Use the EvaluateJsonPath processor with the configs below:

This adds the longitude and latitude attributes to the flowfile. Then use this URL in the InvokeHTTP processor:

http://api.openweathermap.org/data/2.5/weather?lat=${location.latitude}&lon=${location.longitude}&APPID=myapikey

If this answer helped to resolve your issue, click the Accept button below to accept it; that helps community users find solutions quickly.
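The URL substitution can be sketched in plain Python. The latitude/longitude values are taken from the sample sensor response earlier in this thread, and myapikey is a placeholder, as in the original post:

```python
from urllib.parse import urlencode

# Attributes EvaluateJsonPath would place on the flowfile.
attributes = {"location.latitude": "51.482", "location.longitude": "7.408"}

# InvokeHTTP substitutes ${location.latitude} / ${location.longitude}
# into the URL; here we build the same query string by hand.
params = urlencode({
    "lat": attributes["location.latitude"],
    "lon": attributes["location.longitude"],
    "APPID": "myapikey",  # placeholder API key
})
url = "http://api.openweathermap.org/data/2.5/weather?" + params
print(url)
```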
Posted 12-19-2017 04:30 AM
@Tommy Lathrop To import a template onto the NiFi canvas we need to use a specific curl syntax with the template-instance endpoint:

curl -i -X POST -H 'Content-Type:application/json' -d '{"originX": 2.0,"originY": 3.0,"templateId": "<template-id>"}' http://localhost:8080/nifi-api/process-groups/<process-group-id>/template-instance

Here is what I tried. I have a template in my Downloads folder.

1. To upload the template from the Downloads folder to NiFi, I used the curl below with the upload method:

curl -X POST http://localhost:8080/nifi-api/process-groups/226b1e5f-0160-1000-deb0-791f1a6f1f00/templates/upload -k -v -F template=@/cygdrive/c/shu/Downloads/Count_Loop.xml

In the curl above I'm using POST and my process group id is 226b1e5f-0160-1000-deb0-791f1a6f1f00.

Why do we need to specify a process group id while uploading? Templates are owned by a process group (whether that is the root process group or one nested in the canvas). You can upload a template to a particular process group by making use of '/process-groups/{id}/templates/upload'.

Then I gave the path to my Count_Loop.xml file and uploaded the template into the NiFi instance.

2. Now we need to import the uploaded Count_Loop.xml onto the NiFi canvas. For this we need the template id, so hit the API below and get the template id of the uploaded template:

curl http://localhost:8080/nifi-api/flow/templates

In my case the template id is 02a4b939-ac14-4fac-9f05-550cc521b317. Then we need to instantiate the template by using the POST method and template-instance:

POST /process-groups/{id}/template-instance

My command is as follows:

curl -i -X POST -H 'Content-Type:application/json' -d '{"originX": 2.0,"originY": 3.0,"templateId": "02a4b939-ac14-4fac-9f05-550cc521b317"}' http://localhost:8080/nifi-api/process-groups/226b1e5f-0160-1000-deb0-791f1a6f1f00/template-instance

I'm importing the uploaded Count_Loop.xml into process group id 226b1e5f-0160-1000-deb0-791f1a6f1f00; we got the template id earlier, so I'm using template id 02a4b939-ac14-4fac-9f05-550cc521b317 in the curl above.

The template-instance call expects originX, originY, and templateId. If you are missing any of these parameters it results in a Bad Request error.

If this answer helped to resolve your issue, click the Accept button below to accept it; that helps community users find solutions quickly.
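The request the curl command sends can be sketched in Python (building the URL and body only, without hitting a live NiFi instance; the ids are the example ids from this post):

```python
import json

process_group_id = "226b1e5f-0160-1000-deb0-791f1a6f1f00"

# All three keys are required by the template-instance endpoint;
# omitting any of them results in 400 Bad Request.
payload = {
    "originX": 2.0,   # x position on the canvas
    "originY": 3.0,   # y position on the canvas
    "templateId": "02a4b939-ac14-4fac-9f05-550cc521b317",
}

url = (
    "http://localhost:8080/nifi-api/process-groups/"
    f"{process_group_id}/template-instance"
)
body = json.dumps(payload)
print(url)
print(body)
```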
Posted 12-18-2017 06:47 PM
1 Kudo
@Tommy Lathrop Upload means copying the template .xml file into the NiFi instance. Once the template is uploaded, it is ready to be instantiated onto the NiFi canvas. Import means loading the uploaded .xml file onto the NiFi canvas (into the root or a process group).

Let's consider: you have created a flow in a DEV environment and need to promote it to a PROD environment. In this case we first create a template in the DEV environment, then download the template locally. Then you upload the template to the PROD environment, so the template is now available in PROD. To add the uploaded template to the NiFi canvas, you then need to import the template.
Posted 12-18-2017 04:11 PM
@Lanic For NiFi version 1.1.0.2.1.1.0-2, the PutTCP processor's Port property does not support Expression Language.

The issue is fixed in newer versions (NiFi 1.4):

Check whether your NiFi version supports Expression Language for the Port property; if it does not, you need to hard-code the Port value.

Refer to the community thread below on the ListenTCP processor:

https://community.hortonworks.com/questions/140765/nifi-expression-language-support-for-port-number-i.html
Posted 12-18-2017 02:40 PM
2 Kudos
@Rajesh AJ Use the GetFile (or) ListFile/FetchFile processors to fetch the file, then use:

1. A SplitText processor (if each line has one URL) with Line Split Count set to 1.

If your file has 4 lines, the SplitText processor will produce a separate flowfile for each line. Example: input 1 file (with 4 lines), output 4 flowfiles (each line as a separate flowfile).

2. An ExtractText processor to extract the URLs into attributes on the flowfile. Depending on your file content size you may need to change the Maximum Buffer Size property. I'm extracting the whole content of the flowfile into a url attribute using the regex .*:

url	(.*) //extract the whole content of the flowfile into the url attribute

3. An InvokeHTTP processor with ${url}.

We use the url attribute extracted by the ExtractText processor in the InvokeHTTP processor; it changes dynamically with each flowfile's content.

Flow:

1. GetFile
2. SplitText
3. ExtractText
4. InvokeHTTP

If this answer helped to resolve your issue, click the Accept button below to accept it; that helps community users find solutions quickly.
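The SplitText and ExtractText steps can be sketched in plain Python; the file content below is a made-up example, not data from the original question:

```python
import re

file_content = (
    "http://example.com/a\nhttp://example.com/b\nhttp://example.com/c"
)

# SplitText with Line Split Count = 1: one flowfile per line.
flowfiles = file_content.splitlines()

# ExtractText with property url = (.*): the capture group becomes the
# flowfile's url attribute, which InvokeHTTP then references as ${url}.
attributes = [{"url": re.match(r"(.*)", ff).group(1)} for ff in flowfiles]
assert attributes[1]["url"] == "http://example.com/b"
```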