Member since: 09-04-2019

9 Posts
1 Kudos Received
0 Solutions
09-01-2020 09:16 AM

@stevenmatison Thanks for your answer. As my tables are relatively small and only used to duplicate existing data, is there any way to remove the existing folders before importing new data?

regards
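One way this could be done, sketched as an illustration only (the HiveServer2 host, database and table names below are placeholders, and the sketch assumes the target is a managed Hive table): truncate the table first so its existing base_/delta_ folders are removed, then re-run the Sqoop import as before.

  # Clear the managed table first; for a managed table this also removes
  # its old base_/delta_ folders under the warehouse directory.
  # (hiveserver2-host, mydb and mytable are placeholders.)
  beeline -u "jdbc:hive2://hiveserver2-host:10000/mydb" \
          -e "TRUNCATE TABLE mydb.mytable;"

  # Then re-run the Sqoop import exactly as before.
  sqoop import \
    --connect "jdbc:db2://db2-host:50000/MYDB" \
    --username db2user -P \
    --table MYTABLE \
    --hive-import \
    --hive-overwrite \
    --hive-table mydb.mytable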
08-27-2020 02:12 AM

Hello,

When importing data from DB2 via Sqoop into Hive, the data stored in /warehouse/tablespace/managed/hive/databasename/tablename/ grows steadily. For every import (with --hive-import and --hive-overwrite set) a new folder "base_000000n" is created, so the parent folder keeps growing. Is there any way to delete the old folders before importing new data with Sqoop?

regards

Labels:
 - Apache Hive
 - Apache Sqoop
 - HDFS
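A hedged sketch of how the accumulating folders are sometimes handled, assuming the table is a transactional (ACID) managed table and that databasename/tablename and the HiveServer2 host below are placeholders: each --hive-overwrite import writes a new base_N directory, and the obsolete ones are normally removed only after a major compaction has run, which can be requested manually.

  # List the accumulated base_N folders (path as in the question).
  hdfs dfs -ls /warehouse/tablespace/managed/hive/databasename/tablename/

  # Request a major compaction; after it finishes, Hive's cleaner
  # removes the obsolete base_/delta_ directories.
  beeline -u "jdbc:hive2://hiveserver2-host:10000/databasename" \
          -e "ALTER TABLE databasename.tablename COMPACT 'MAJOR'; SHOW COMPACTIONS;"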
07-13-2020 12:54 AM
1 Kudo

Hello @VidyaSargur,

Thanks for your answer. You are totally right; I only realized that this was an older thread after I had already posted. I have therefore created a new thread (https://community.cloudera.com/t5/Support-Questions/Permanently-store-sqoop-map-column-hive-mapping-for-DB2/td-p/299556).

regards
07-12-2020 05:03 AM

Hello,

I am importing a DB2 database with Sqoop into Hive. Is there any way to permanently store mappings for specific types? I am importing several tables that contain "Character" columns, all of which have to be mapped manually, and I want to permanently store that "Character" is imported as "Varchar". Can this be done?

regards
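For context, a sketch of how such a per-column mapping is usually written on the command line and how it could be kept around, offered as an illustration only (the column, table, host and credential names below are placeholders): the override goes into --map-column-hive, and wrapping the import in a saved Sqoop job stores those options so they do not have to be retyped for every run.

  # Map a DB2 "Character" column explicitly to VARCHAR in Hive and
  # save the whole import as a reusable Sqoop job.
  # (MYCOL, MYTABLE, mydb.mytable, db2-host and db2user are placeholders.)
  sqoop job --create import_mytable -- import \
    --connect "jdbc:db2://db2-host:50000/MYDB" \
    --username db2user -P \
    --table MYTABLE \
    --map-column-hive 'MYCOL=VARCHAR(100)' \
    --hive-import \
    --hive-overwrite \
    --hive-table mydb.mytable

  # Later imports reuse the stored options:
  sqoop job --exec import_mytable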
09-11-2019 05:02 PM

I've tried to get the Docker container to run under Windows 10 and Ubuntu Server. In both cases I wasn't able to reach the Ambari web interface. Since I couldn't resolve the issue I switched to the VirtualBox VM, which works pretty well. Thanks anyway!
09-05-2019 12:35 AM

Hello,

I've installed the sandbox-hdp Docker container on my Ubuntu server using Kitematic. The deploy process was quick and easy. The ports 22, 4200 and 8080 are forwarded to localhost on different ports (32785-32783). When trying to access these with my browser, the page can't be reached.

Any tips? Do I need to use some other network adapter? Are other ports needed?

regards
muffex

Labels:
 - Hortonworks Data Platform (HDP)
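A small diagnostic sketch for this situation, assuming the container name sandbox-hdp below stands in for whatever Kitematic actually created: check which host ports Docker published and probe the mapped Ambari port directly from the host before trying a browser.

  # Show the running container and its published port mappings.
  docker ps
  docker port sandbox-hdp        # e.g. 8080/tcp -> 0.0.0.0:32783

  # Probe the mapped Ambari port on the host itself
  # (replace 32783 with whatever "docker port" reports for 8080).
  curl -v http://localhost:32783/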