Member since 09-27-2018
138 Posts | 23 Kudos Received | 10 Solutions
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 11953 | 02-28-2021 10:23 PM |
|  | 2508 | 02-08-2021 11:53 PM |
|  | 36631 | 12-16-2020 11:31 PM |
|  | 9245 | 12-14-2020 11:02 PM |
|  | 6409 | 12-14-2020 12:18 AM |

12-15-2020 11:31 PM
@stephane_davy

Now I've come one step further. I defined this schema for the JsonTreeReader used by JoltTransformRecord:

{
  "name": "HCC_JOLTTRANSFORMRECORD_IN",
  "type": "record",
  "namespace": "HCC_JOLTTRANSFORMRECORD_IN",
  "fields": [
    {
      "name": "myJSON",
      "type": {
        "type": "array",
        "items": {
          "name": "myJSON_record",
          "type": "record",
          "fields": [
            {
              "name": "myfield",
              "type": "string"
            },
            {
              "name": "myfield1",
              "type": "string"
            },
            {
              "name": "myfield2",
              "type": "string"
            }
          ]
        }
      }
    }
  ]
}

So the error "Error transforming the first record:" is gone!

Now I get another error, this time concerning the writer schema:

JoltTransformRecord[id=65b2b5fd-0176-1000-ffff-ffffd0f23bd9] Unable to transform StandardFlowFileRecord[uuid=7e4fe006-1eb0-44cd-9e16-f4c8a8c533df,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1608074221236-4, container=default, section=4], offset=584493, length=780],offset=0,name=d92eab41-fa79-441f-a5d9-c6e7f6be10c0,size=780] due to org.apache.nifi.serialization.record.util.IllegalTypeConversionException: Cannot convert value [[Ljava.lang.Object;@1b321d9e] of type class [Ljava.lang.Object; to Record for field r: Cannot convert value [[Ljava.lang.Object;@1b321d9e] of type class [Ljava.lang.Object; to Record for field r

Working on it...
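The exception says the value for field "r" arrives as an Object array while the writer schema expects a Record, so my guess is that the writer schema has to declare "r" as an array of records. A sketch of what I would try next (the schema name/namespace is just a placeholder, and I'm assuming the Jolt output puts the transformed entries under "r"):

{
  "name": "HCC_JOLTTRANSFORMRECORD_OUT",
  "type": "record",
  "namespace": "HCC_JOLTTRANSFORMRECORD_OUT",
  "fields": [
    {
      "name": "r",
      "type": {
        "type": "array",
        "items": {
          "name": "r_record",
          "type": "record",
          "fields": [
            { "name": "myfield", "type": "string" },
            { "name": "myfield1", "type": "string" },
            { "name": "myfield2", "type": "string" }
          ]
        }
      }
    }
  ]
}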
						
					
12-15-2020 10:42 PM
@stephane_davy

I assume this question follows on from your previous one: https://community.cloudera.com/t5/Support-Questions/Nifi-Multiple-predicate-in-recordpath-filter/m-p/307645#M223278

So I tried to do the same as you with my test JSON:

{
   "myJSON": [
      {
         "myfield": "JustForHavingJson",
         "myfield1": "A",
         "myfield2": "C"
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "B",
         "myfield2": "C"
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "C",
         "myfield2": ""
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "E",
         "myfield2": ""
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "X",
         "myfield2": ""
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "",
         "myfield2": ""
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "D",
         "myfield2": "G"
      }
   ]
}

But JoltTransformRecord doesn't work for me either, with or without a schema; I get the same error as you (NiFi 1.11.1).

One possible quick workaround I found (settings sketched below):

1. EvaluateJsonPath: write the flowfile content into an attribute
2. UpdateAttribute: ${FF_CONTENT:jsonPath('$.myJSON')}
3. ReplaceText: write the attribute back into the flowfile content

If you get JoltTransformRecord to work, I would like to know how. Thanks.
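For reference, a sketch of how those three processors could be configured (the FF_CONTENT attribute name is just what I used above; the exact property values may need adjusting for your flow):

EvaluateJsonPath
    Destination          = flowfile-attribute
    Return Type          = json
    FF_CONTENT           = $                                     (dynamic property: whole JSON into the attribute FF_CONTENT)

UpdateAttribute
    FF_CONTENT           = ${FF_CONTENT:jsonPath('$.myJSON')}    (keep only the array)

ReplaceText
    Replacement Strategy = Always Replace
    Evaluation Mode      = Entire text
    Replacement Value    = ${FF_CONTENT}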
						
					
12-14-2020 11:02 PM
@adhishankarit

Sadly, I have no experience with writing this kind of logging to own files on an NFS share.

But what about logging to nifi-app.log? I don't understand the problem you mentioned about this in your first post. The LogAttribute processor gives you some options for what to log there (attributes, content) and how. With the "Log prefix" property you can find and evaluate the entries later by that value. IMHO logging all attributes would be an advantage, because not all processors deliver the same attributes after they run.

Maybe you can also do something with scripts, as mentioned here: https://community.cloudera.com/t5/Support-Questions/Is-possible-to-write-an-attribute-into-a-file-and-also-keep/td-p/184414

Sorry, I'm neither fit with scripting nor with this kind of logging, so I can't help you further. But maybe someone else here can.
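Just to illustrate what I mean, LogAttribute could be configured roughly like this (the prefix value is only an example):

LogAttribute
    Log Level         = info
    Log Payload       = false           (set to true if you also need the flowfile content)
    Attributes to Log =                 (leave empty to log all attributes)
    Log prefix        = MYFLOW_STEP_1   (example value; search nifi-app.log for it later)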
						
					
12-14-2020 08:21 AM
@opalo54

Thanks for posting your solution! I was just trying it too, but I think your input data has to be an array for this to work. Am I getting that right?
						
					
12-14-2020 03:53 AM
@jainN

Maybe you can describe your problem a bit more? Then it may be possible to give you some help.

For example:
- Do all files (with different formats) come from the same processor?
- What is your "previous processor"?
- Do you get the data from InvokeHTTP?
- "When we get the json file": do you get responses without content?
						
					
12-14-2020 03:46 AM
@adhishankarit

I don't know whether I understand your requirement correctly. You want to log EACH single NiFi processor regardless of whether it succeeds or not (error, retry, etc.). Right?

Evaluating the data from nifi-app.log (I haven't done this so far) or using the REST API (I suppose something in the "Flow" area: https://nifi.apache.org/docs/nifi-docs/rest-api/index.html) might be possible.

But why don't you log this data directly from the flow by writing to a database log table after each processor? Concerns about performance?
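For the REST API route, something like this could be a starting point (this assumes an unsecured NiFi; with security enabled you need a token or certificate):

# processor-level stats for everything below the root process group
curl "http://localhost:8080/nifi-api/flow/process-groups/root/status?recursive=true"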
						
					
12-14-2020 12:18 AM (2 Kudos)
@stephane_davy

Sorry, I couldn't work on this further last week. Now my solution looks like this.

JSON from GenerateFlowFile:

[
      {
         "myfield": "JustForHavingJson",
         "myfield1": "A"
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "B"
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "C"
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "D"
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "E"
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": "X"
      },
      {
         "myfield": "JustForHavingJson",
         "myfield1": ""
      }
]

Definitions of the ControllerServices and the AvroSchemaRegistry:
[screenshot: ControllerServices and AvroSchemaRegistry]

Flow:
[screenshot: Flow]

Details of the record-processing processors:
[screenshot: RecordProcessing processors]

Resulting flowfile content after MergeRecord:

{"myfield":"JustForHavingJson","myfield1":"A","myfield2":"C"}
{"myfield":"JustForHavingJson","myfield1":"B","myfield2":"C"}
{"myfield":"JustForHavingJson","myfield1":"C","myfield2":""}
{"myfield":"JustForHavingJson","myfield1":"E","myfield2":""}
{"myfield":"JustForHavingJson","myfield1":"X","myfield2":""}
{"myfield":"JustForHavingJson","myfield1":"","myfield2":""}
{"myfield":"JustForHavingJson","myfield1":"D","myfield2":"G"}

Personally, I wouldn't do a MergeRecord at the end but would carry on with the three single connections from the UpdateRecord processors.

Do you think this could be a possible solution, or have you found a better way to do it?
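In case the AvroSchemaRegistry screenshot is hard to read: a minimal Avro schema matching these records (the namespace is just a placeholder; myfield2 is nullable because the GenerateFlowFile JSON does not contain it) could look like this:

{
  "name": "myJSON_record",
  "type": "record",
  "namespace": "HCC",
  "fields": [
    { "name": "myfield",  "type": "string" },
    { "name": "myfield1", "type": "string" },
    { "name": "myfield2", "type": ["null", "string"], "default": null }
  ]
}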
						
					
12-13-2020 10:58 PM
@GMAN

You can do this with UpdateAttribute. In my example:

Date_Time ==> ${YourUnixTimestamp:toDate():format('yyyy-MM-dd HH:mm:ss.SSS', 'UTC')}
YourDate_YYYY-MM-dd ==> ${YourUnixTimestamp:toDate():format('yyyy-MM-dd')}

You can check the result here: https://www.unixtimestamp.com/index.php (but you have to remove the last three digits of the value there, because the attribute value is in milliseconds while that site expects seconds).

Hope this helps!
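For example, with a made-up attribute YourUnixTimestamp = 1608000000000, the expressions above give:

Date_Time           = 2020-12-15 02:40:00.000
YourDate_YYYY-MM-dd = 2020-12-15

(The second expression has no time zone argument, so it uses the NiFi server's local time zone; the values shown assume UTC.)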
						
					
12-09-2020 04:03 AM
@stephane_davy

I'm working in the same direction, because I couldn't believe there is no way to do it. But I'm still struggling with getting the MergeRecord right. So we are both going the same way.
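In case it helps for comparison, the MergeRecord properties I'm experimenting with look roughly like this (the reader/writer names and values are guesses for a sketch, not a verified setup):

MergeRecord
    Record Reader              = JsonTreeReader         (assumption: same reader as the rest of the flow)
    Record Writer              = JsonRecordSetWriter    (assumption)
    Merge Strategy             = Bin-Packing Algorithm
    Correlation Attribute Name = filename               (assumption: regroup records coming from the same source file)
    Minimum Number of Records  = 7                      (example: the number of records in my test JSON)
    Max Bin Age                = 30 sec                 (flush bins that never fill up completely)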
						
					
12-07-2020 11:54 PM
Hello @stephane_davy

I'm sorry, I hadn't seen that record processing is a "must". Because I don't have much experience with this kind of syntax, I sadly won't be able to help you here.

But if you find a solution, please be so kind and share it with all of us here. Thank you!
						
					