Member since: 06-26-2015

515 Posts · 137 Kudos Received · 114 Solutions

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 2144 | 09-20-2022 03:33 PM |
|  | 5840 | 09-19-2022 04:47 PM |
|  | 3133 | 09-11-2022 05:01 PM |
|  | 3521 | 09-06-2022 02:23 PM |
|  | 5560 | 09-06-2022 04:30 AM |

09-11-2022 05:01 PM · 1 Kudo
@sekhar1,

The CDP user that you're using to execute your job needs an "IDBroker mapping" to a valid AWS role to be able to access the contents of the S3 bucket.

Please check this: https://docs.cloudera.com/cdf-datahub/7.2.10/nifi-hive-ingest/topics/cdf-datahub-hive-ingest-idbroker-mapping.html

Cheers,
André

09-07-2022 05:04 AM · 1 Kudo
Everything you do in the NiFi UI can also be done using the NiFi REST API. So if you want or need to automate it, it's totally possible and not difficult.

Cheers,
André
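
For instance, here is a minimal sketch of driving NiFi from Python (my illustration, not part of the original answer). It assumes an unsecured NiFi instance at http://localhost:8080, and the processor ID is a hypothetical placeholder:

```python
import requests

NIFI = "http://localhost:8080/nifi-api"  # assumed local, unsecured instance

# Fetch the root process group flow: the same data the UI canvas renders
root = requests.get(f"{NIFI}/flow/process-groups/root").json()
print(root["processGroupFlow"]["id"])

# Start a processor. Run-status changes must carry the current revision,
# so fetch the processor entity first.
proc_id = "00000000-0000-0000-0000-000000000000"  # hypothetical processor ID
proc = requests.get(f"{NIFI}/processors/{proc_id}").json()
requests.put(
    f"{NIFI}/processors/{proc_id}/run-status",
    json={"revision": proc["revision"], "state": "RUNNING"},
)
```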

09-07-2022 03:57 AM
@SandyClouds,

Have a look at Parameters and Parameter Contexts: https://nifi.apache.org/docs/nifi-docs/html/user-guide.html#Parameters

Cheers,
André
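
As a quick illustration of the syntax (mine, not from the post): once a Parameter Context is assigned to a process group, a processor property can reference a parameter by name, e.g. a hypothetical parameter called input.dir in a GetFile processor's Input Directory property:

```
#{input.dir}
```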

09-06-2022 03:24 PM
@rafy,

Same approach. Just create the processor(s) and let the files flow through them 🙂

Cheers,
André

09-06-2022 02:23 PM
@rafy,

You shouldn't need to extract the content as an attribute. Instead, use ReplaceText to replace the content of the flowfile with the SQL template, like this:

```sql
INSERT INTO my_table (xml_col) VALUES ('$1')
```

where $1 is a reference to the default regex capture group, which captures the entire content.

In reality it requires a bit more labour, since you have to escape the single quotes in the XML first.

Cheers,
André
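
To make the escaping step concrete, here is a small sketch (my addition, with made-up sample data) of the string manipulation the two ReplaceText steps would perform:

```python
# Hypothetical flowfile content
xml = "<customer tier='gold'><name>Jane</name></customer>"

# Step 1: escape single quotes for SQL (a ReplaceText in literal mode: ' -> '')
escaped = xml.replace("'", "''")

# Step 2: wrap the whole content in the SQL template (a second ReplaceText with
# its default search value (?s)(^.*$), where $1 captures the entire content)
sql = f"INSERT INTO my_table (xml_col) VALUES ('{escaped}')"
print(sql)
```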

09-06-2022 02:17 PM
@VenkatG,

What are you trying to use the resulting JSON for? It seems like a bit of an odd format to me. Nevertheless, here's a JOLT spec to achieve that (or close):

```json
[
  {
    "operation": "default",
    "spec": {
      "__timestamp": "${now()}"
    }
  },
  {
    "operation": "shift",
    "spec": {
      "custId": "Rows[0].values[0]",
      "name": "Rows[0].values[1]",
      "Address2": "Rows[0].values[2]",
      "Address1": "Rows[0].values[3]",
      "zip": "Rows[0].values[4]",
      "__timestamp": "Rows[0].values[5]"
    }
  },
  {
    "operation": "default",
    "spec": {
      "operationType": "Insert",
      "Source": "Dev"
    }
  }
]
```

Cheers,
André
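
For illustration (the sample data is mine, not from the thread), an input like this:

```json
{
  "custId": "C001",
  "name": "Jane",
  "Address2": "Apt 4",
  "Address1": "1 Main St",
  "zip": "90210"
}
```

would come out roughly as follows, with ${now()} already evaluated by NiFi's Expression Language before the JOLT spec runs:

```json
{
  "Rows": [
    {
      "values": ["C001", "Jane", "Apt 4", "1 Main St", "90210", "Tue Sep 06 14:17:00 UTC 2022"]
    }
  ],
  "operationType": "Insert",
  "Source": "Dev"
}
```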

09-06-2022 04:30 AM · 1 Kudo
Hi @code,

I don't think this is actually possible. Even if there were a way to enter a literal NULL for the lookup value, the controller service is probably comparing the lookup value with an equals operation (e.g. mytable.mykey = lookup_value), and in relational databases the comparison NULL = NULL never evaluates to TRUE (it yields NULL/unknown; the only way to compare against a NULL is with the IS operator).

What you can try is to create a view on top of that table that converts NULLs to some string that you can compare to lookup values in NiFi. You can then use the view name in the controller service configuration instead of the table name.

For example:

```sql
CREATE VIEW v_mytable AS
SELECT
  NVL(mykey, '<NULL>') AS key_without_nulls, mytable.*
FROM mytable
```

Be aware of the potential performance implications of this, since it could prevent existing table indexes from being used for lookups.

Cheers,
André
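
With that view in place, the lookup service's generated query would effectively compare against the sentinel string. A sketch of the resulting comparison (column names are assumed for illustration):

```sql
-- Rows whose original key was NULL now match via the sentinel value
SELECT my_value_column FROM v_mytable WHERE key_without_nulls = '<NULL>'
```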

09-06-2022 04:16 AM
@Markyd,

Functions used in Hive queries are usually passed to the Hive service as-is; I don't think the ODBC driver does any particular processing or validation of them. At the end of the day, it's a matter of whether the Hive version you're running supports that function, not which driver version you're using.

Here you can see a comprehensive list of Hive functions and the Hive version in which each was introduced: https://cwiki.apache.org/confluence/display/hive/languagemanual+udf

Which version of Hive are you using? Are you using the Cloudera ODBC Driver for Hive? Which version?

Cheers,
André
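
As a concrete illustration (my example, not from the post): the date version of trunc() was only added in Hive 1.2.0, so the same query succeeds or fails depending on the Hive server version, regardless of the driver submitting it:

```sql
-- Works on Hive >= 1.2.0 and fails on older servers, whatever the driver version
SELECT trunc('2022-09-06', 'MM');  -- returns 2022-09-01
```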

09-05-2022 03:54 PM
@Ahmed_Abuhaimed,

Have you looked into the change data capture functionality for SAP HANA? https://help.sap.com/docs/SAP_DATA_INTELLIGENCE/1c1341f6911f4da5a35b191b40b426c8/023c75aedfdd4646934f2d9ccde5660a.html

Cheers,
André

09-05-2022 03:48 PM
@samrathal,

It seems that "subtract" only works for integers, and your value is a long. The longSubtract function only works if longs are passed as parameters, and I don't know of a way to specify a long literal in JOLT (I tried 60L, but that doesn't work).

The following does a little bit more work but also achieves what you want:

```json
[
  {
    "operation": "modify-overwrite-beta",
    "spec": {
      "currenttime": "=toLong(${now():toNumber()})",
      "minute": "=toLong(60)"
    }
  },
  {
    "operation": "modify-overwrite-beta",
    "spec": {
      "timeOneMinu": "=longSubtract(@(1,currenttime), @(1,minute))"
    }
  },
  {
    "operation": "remove",
    "spec": {
      "minute": ""
    }
  }
]
```

Output:

```json
{
  "currenttime": 1662417935173,
  "timeOneMinu": 1662417935113
}
```

Cheers,
André