Member since 03-31-2018

9 Posts
0 Kudos Received
1 Solution
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 5097 | 04-14-2018 08:14 PM |
06-12-2019 10:44 AM

@shu, Thanks for the solution. It worked for me.
05-29-2019 11:28 AM

Hello, I have a flow that reads certain parameters and decides whether an ingestion is a database ingestion or a file ingestion. DB ingestions are working fine, but for file ingestions I am not able to connect the flow to the FetchFile/GetFile/ListFile processors; it looks like these processors do not allow an input port connection. Is there any way, or any processor, to get this done? My file ingestion template is designed to take its input from different folders/files based on input parameters from the outer flow.

Labels: Apache NiFi
05-29-2019 11:14 AM

Hi @shu, Thanks for the input.
05-26-2019 05:42 PM

Thanks @Shu, that's great input. I have another question: is there any possibility to parameterize the scheduling parameters as well?
05-24-2019 11:15 PM

Is there any way to parameterize NiFi DB connection services? I want to use the same NiFi flow to extract data from different databases with different sets of parameters (for example, tables and data types), so I am wondering whether there is any way to pass the connection pooling service, just like any other parameter value, as a variable.

Labels: Apache NiFi
04-14-2018 08:14 PM

@shu, I have managed to write a Groovy script that extracts the primary key and partition column information from flow file attributes, and I have successfully ingested the data into the valid and invalid tables. The script is given below for reference.

```groovy
def flowFile = session.get()
if (!flowFile) return

def fieldstructure = flowFile.getAttribute("metadata.table.feedFieldStructure")
def fieldpartition = flowFile.getAttribute("metadata.table.partitionSpecs")
def primaryKeys = ""
def partitioncolumn = ""
def partitioncolumnlist = []

// Each field structure line is pipe-delimited: name|type|primaryKeyFlag
if (fieldstructure != null) {
    fieldstructure.tokenize('\n').each { line ->
        def column = line.tokenize('|')
        if (column[2] == '1') {
            primaryKeys = primaryKeys + " and " + column[0] + " is not null"
        }
    }
}

// The third field of a partition spec line may wrap the source column
// in parentheses, e.g. "...|...|fn(col)"; extract the bare column name
if (fieldpartition != null) {
    fieldpartition.tokenize('\n').each { fieldpartitionline ->
        def name = fieldpartitionline.tokenize('|')[2]
        if (name.contains('(')) {
            name = name.tokenize('(')[1].tokenize(')')[0]
        }
        partitioncolumnlist.add(name)
    }
    partitioncolumnlist.unique(false).each { partition ->
        partitioncolumn = partitioncolumn + " and " + partition + " is not null"
    }
}

// Attribute values must be strings, so missing metadata yields ""
flowFile = session.putAttribute(flowFile, "PartitionColumns", partitioncolumn)
flowFile = session.putAttribute(flowFile, "PrimaryKey", primaryKeys)
session.transfer(flowFile, REL_SUCCESS)
```
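Outside NiFi, the clause-building part of this script can be exercised on its own. The sketch below assumes a hypothetical sample of the pipe-delimited `metadata.table.feedFieldStructure` attribute (one `name|type|primaryKeyFlag` line per column):

```groovy
// Hypothetical sample of the "metadata.table.feedFieldStructure" attribute
def fieldstructure = "id|int|1\nname|string|0\ncreated_dt|timestamp|1"

def primaryKeys = ""
fieldstructure.tokenize('\n').each { line ->
    def column = line.tokenize('|')
    if (column[2] == '1') {
        // append one predicate per primary-key column
        primaryKeys = primaryKeys + " and " + column[0] + " is not null"
    }
}

println primaryKeys  // " and id is not null and created_dt is not null"
```

The fragment is meant to be appended to an existing WHERE clause, which is why it starts with " and ".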
04-03-2018 09:19 PM

@shu, Thanks for your reply. I have managed to do this by using the HiveQLProcessor, but I have another question: is there any processor or property available to get the primary key columns of Hive tables? Some junk/duplicate null rows that are not in the source file are inserted into the table, so I wanted to remove them before moving the data to the next stage by filtering on the primary keys being not null. Thanks
04-01-2018 12:49 PM

Hi, I am creating a reusable flow where I need to copy data between Hive tables. The source and target table definitions will be the same, but the number of columns and the partition columns will differ for each unique execution. Is there any NiFi processor available to do this? Thanks in advance.

Labels: Apache NiFi