Member since 08-06-2019
10-08-2019 01:36 PM
Thank you for the response. This was the correct answer, but I was unable to verify it until recently.
08-06-2019 07:52 PM
I have JSON input of the following format:

    {
      "Id": 1000000,
      "ReportName": "TestReport",
      "Results": [{
        "Id": 1,
        "ResultId": "1000000-0",
        "Query": {
          "Id": 1,
          "Name": "TestQuery0"
        }
      }, {
        "Id": 2,
        "ResultId": "1000000-1",
        "Query": {
          "Id": 2,
          "Name": "TestQuery1"
        }
      }]
    }

These files can become quite large depending on the number of Results in the report, and I was hoping to convert the single FlowFile into multiple records for processing. However, because of the JSON's structure, SplitRecord yields only a single record: there is one report per FlowFile and therefore only one root-level element. I am looking for a method or strategy to split the FlowFile into smaller records while still maintaining the cohesiveness of the report when it is finally put into HDFS.

Current strategy:

1. Use JoltTransformJSON to inject the report information into each result
2. Use SplitRecord to split the FlowFile on each result
3. Process the records
4. Use MergeRecord to get back the FlowFile modified in step 1
5. Convert back to the original FlowFile format (not sure of the best method here)
6. Use MergeContent and push to HDFS

Any advice would be appreciated! Thank you.
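The enrich-split-merge round trip in steps 1, 2, 4 and 5 above can be sketched outside NiFi as plain Python. This is only an illustration of the data reshaping (the function names and the sample report are hypothetical, not NiFi APIs): each per-result record carries a copy of the report-level fields, so the original report can be reassembled from any set of records later.

```python
# Hypothetical sample mirroring the report structure from the question.
report = {
    "Id": 1000000,
    "ReportName": "TestReport",
    "Results": [
        {"Id": 1, "ResultId": "1000000-0", "Query": {"Id": 1, "Name": "TestQuery0"}},
        {"Id": 2, "ResultId": "1000000-1", "Query": {"Id": 2, "Name": "TestQuery1"}},
    ],
}

def split_report(report):
    """Steps 1-2: inject the report-level fields into each result
    (what JoltTransformJSON would do) and emit one record per Result."""
    header = {k: v for k, v in report.items() if k != "Results"}
    return [{**header, "Result": result} for result in report["Results"]]

def merge_records(records):
    """Steps 4-5: rebuild the original report from the enriched records
    by recovering the shared header and collecting the Result payloads."""
    header = {k: v for k, v in records[0].items() if k != "Result"}
    header["Results"] = [r["Result"] for r in records]
    return header

records = split_report(report)          # two records, each with Id + ReportName
assert merge_records(records) == report # round-trips back to the original
```

In NiFi terms, keeping the report header on every record is what lets MergeRecord (correlated on the report Id) reconstruct the original document without any out-of-band state.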
Labels: Apache NiFi