Member since 04-03-2017
      
164 Posts | 8 Kudos Received | 4 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2905 | 03-09-2021 10:47 PM |
| | 3913 | 12-10-2018 10:59 AM |
| | 6879 | 12-02-2018 08:55 PM |
| | 11445 | 11-28-2018 10:38 AM |
			
    
	
		
		
07-29-2022 08:31 AM

You can try to configure it like this inside the Oozie Spark action:

...
<configuration>
    <property>
        <name>spark.yarn.keytab</name>
        <value>path_to_keytab</value>
    </property>
    <property>
        <name>spark.yarn.principal</name>
        <value>principal@REALM.COM</value>
    </property>
</configuration>
...    
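For context, a minimal sketch of where this `<configuration>` block sits inside a complete Oozie `<spark>` action; the action name, master, class, and jar path here are placeholder assumptions, not values from this thread:

```xml
<action name="spark-job">
    <spark xmlns="uri:oozie:spark-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
                <name>spark.yarn.keytab</name>
                <value>path_to_keytab</value>
            </property>
            <property>
                <name>spark.yarn.principal</name>
                <value>principal@REALM.COM</value>
            </property>
        </configuration>
        <master>yarn-cluster</master>
        <name>MySparkJob</name>
        <class>com.example.MyMainClass</class>
        <jar>${nameNode}/apps/myapp/lib/myapp.jar</jar>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>
```

In the 0.1 spark-action schema, the `<configuration>` element goes after `<name-node>` and before `<master>`.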
						
					
    
	
		
		
03-10-2021 02:44 PM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
Issue resolved with your solution, thanks. (CDH 6.3.3)
						
					
    
	
		
		
06-12-2020 10:24 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
Hi to all,

I'm looking for a way to import data into ORC Hive tables with dynamic partitioning. I've seen that Sqoop can import and partition data dynamically by column value, but this works only for the first run: on the following job launches I get an 'org.apache.hive.hcatalog.common.HCatException : 2002 : Partition already present with given partition key values' error, since Sqoop attempts to append data to existing partitions.

Any idea how to resolve this issue, which makes dynamic partitioning unusable in Sqoop?
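For reference, a hedged sketch of the kind of Sqoop invocation being described (host, credentials, and table names are made-up placeholders). With `--hcatalog-table` and no static `--hcatalog-partition-values`, Sqoop partitions dynamically by the partition columns present in the source data:

```shell
# Hypothetical invocation; dbhost, etl, orders, orders_orc are placeholders.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username etl -P \
  --table orders \
  --hcatalog-database default \
  --hcatalog-table orders_orc \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orcfile"
# First run: creates the ORC table and its partitions.
# Later runs (without --create-hcatalog-table) hit the HCatException 2002
# described above when incoming rows map to partitions that already exist.
```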
						
					
    
	
		
		
04-11-2020 03:19 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
Hi,

NOTE: Parquet is hard-coded to write its temporary data to /tmp even when the target directory is different.

Kindly check /tmp for the intermediate data; you will see it there.

Regards
						
					
    
	
		
		
04-09-2020 12:45 PM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
							 I'm happy to see you resolved your issue. Please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.  
   
    
   
   
						
					
    
	
		
		
11-11-2019 05:01 PM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
Just remove the unwanted characters at the end of each line in the shell script. It will work.
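The "unwanted characters" at end of line are typically Windows carriage returns (CRLF line endings). A minimal sketch of stripping them — the file path is a made-up example, and `sed -i` here is GNU sed:

```shell
# Create a script with a Windows-style CRLF line ending (for illustration).
printf 'echo hello\r\n' > /tmp/myscript.sh

# Strip the trailing carriage return from every line.
sed -i 's/\r$//' /tmp/myscript.sh

# dos2unix /tmp/myscript.sh achieves the same, if it is installed.
sh /tmp/myscript.sh   # prints: hello
```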
						
					
    
	
		
		
10-07-2019 10:35 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
Hi Dwill,

Did the Sqoop import work for you with the SSL-enabled Oracle DB? I have the same kind of requirement, to use Sqoop import with an SSL-enabled DB, and I am trying to connect through Oracle Wallet but getting network adapter issues.

Could you please share the steps if it is working fine for you?

Thank you.
						
					
    
	
		
		
08-28-2019 02:16 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
Hi,

Yes, I configured the Sqoop gateway on both hosts.

Please tell me how to run the saved Sqoop jobs on the master node itself.

Thanks,
Akhila.
						
					
    
	
		
		
08-25-2019 09:11 PM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
Hi,

It looks like the Spark job that is started is failing. Can you please share the logs for the Spark job? The application ID is shown in the snapshot as the Spark job initiates, just before it fails.

## yarn logs -applicationId <application ID> -appOwner <owner name>

Kindly copy the logs into a text file and attach it to the case. This will give more leads on this.

Regards
Nitish
						
					
    
	
		
		
12-12-2018 05:54 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
Hi,

This permission error happens only when Oozie starts the shell action as the yarn user rather than the hbase user.

Regards
Nitish
						
					