Member since 07-31-2019
346 Posts | 259 Kudos Received | 62 Solutions
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 3896 | 08-22-2018 06:02 PM |
|  | 2229 | 03-26-2018 11:48 AM |
|  | 5263 | 03-15-2018 01:25 PM |
|  | 5632 | 03-01-2018 08:13 PM |
|  | 1870 | 02-20-2018 01:05 PM |
01-06-2016 02:42 PM | 3 Kudos

Atlas only supports the Sqoop bridge for Hive imports: http://atlas.incubator.apache.org/Bridge-Sqoop.html. Currently I don't see any Pig integration. The metadata is stored in a Titan/HBase repository, which is a graph database. Here is the link to the architecture: http://atlas.incubator.apache.org/Architecture.html. I have not heard recently of third-party integration with Atlas, but I suspect it's on the roadmap.
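For context, the kind of job that bridge captures is an ordinary Sqoop Hive import. A minimal sketch, assuming the Atlas Sqoop hook is configured on the Sqoop client; the host, database, and table names are placeholders:

```bash
# A Sqoop import that lands data in Hive; with the Atlas Sqoop hook enabled on this
# client, the bridge records the resulting lineage. All names below are made up.
sqoop import \
  --connect "jdbc:mysql://dbhost.example.com:3306/sales" \
  --username sqoop_user -P \
  --table customers \
  --hive-import \
  --hive-database default \
  --hive-table customers \
  -m 1
```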
12-30-2015 08:11 PM

Hi @R M. Instead of Sqoop, you may want to try using the native Teradata connector for Hadoop (command-line edition): http://downloads.teradata.com/download/connectivity/teradata-connector-for-hadoop-command-line-edition.
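As a rough sketch of what the command-line edition looks like; the tool class and option names are recalled from the TDCH documentation rather than taken from this thread, and all hosts, credentials, paths, and tables are placeholders, so check the guide bundled with the download for your version:

```bash
# Rough sketch of a TDCH import into Hive; verify the jar path, tool class, and
# option names against the documentation shipped with your TDCH version.
export LIB_JARS=...   # TDCH and Hive dependency jars, per the install guide
hadoop jar /usr/lib/tdch/teradata-connector.jar \
  com.teradata.connector.common.tool.ConnectorImportTool \
  -libjars "$LIB_JARS" \
  -url jdbc:teradata://td-host.example.com/DATABASE=sales \
  -username td_user -password '********' \
  -jobtype hive \
  -sourcetable orders \
  -targettable orders
```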
12-29-2015 02:55 PM | 1 Kudo

Try echo $HIVE_CONF_DIR and view the output. It should be /etc/hive/conf.
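A quick way to run that check and fall back to the standard location if the variable is unset (/etc/hive/conf is the usual HDP default; adjust if your cluster keeps hive-site.xml elsewhere):

```bash
# Check where the Hive client thinks its configuration lives.
echo "$HIVE_CONF_DIR"

# If it prints nothing, point it at the standard location for this shell session
# and confirm hive-site.xml is actually there.
export HIVE_CONF_DIR=${HIVE_CONF_DIR:-/etc/hive/conf}
ls "$HIVE_CONF_DIR"/hive-site.xml
```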
12-28-2015 09:14 PM | 1 Kudo

@Sam Mingolelli Agree, but it's only because those are the minimum TTL requirements that are set. Change the TTL values and you should be able to get by with less space. It took two months to fill up for you; I've seen it fill up in days on multi-node clusters.
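As an illustration, the retention values live in ams-site and can be changed from the Ambari UI or with the configs.sh helper that ships with Ambari 2.x; the host, cluster name, credentials, and the 86400-second value below are placeholders, so pick a retention that fits your disk budget:

```bash
# Shorten one of the Ambari Metrics retention (TTL) settings in ams-site via the
# configs.sh helper. Restart the Metrics Collector afterwards for it to take effect.
cd /var/lib/ambari-server/resources/scripts

./configs.sh -u admin -p admin set ambari.example.com MyCluster ams-site \
  timeline.metrics.host.aggregator.ttl 86400   # value is in seconds
```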
12-28-2015 06:53 PM | 3 Kudos

@Sam Mingolelli Which version of Ambari are you using? Ambari 2.1 does allow you to truncate. It would be easier to remove AMS and reinstall. We also recommend a dedicated minimum of 10 GB for AMS. See: https://cwiki.apache.org/confluence/display/AMBAR... You may also want to edit your TTL settings: https://cwiki.apache.org/confluence/display/AMBARI/Known+Issues and https://cwiki.apache.org/confluence/display/AMBARI/Configuration
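To see how much space the collector is actually using before deciding between a reinstall and a TTL change, something like the following helps; the directories are the usual embedded-mode defaults, so confirm them against ams-hbase-site in your cluster:

```bash
# How much data the embedded-mode Metrics Collector has accumulated.
du -sh /var/lib/ambari-metrics-collector/hbase \
       /var/lib/ambari-metrics-collector/hbase-tmp 2>/dev/null

# Free space on the volume hosting the collector data.
df -h /var/lib/ambari-metrics-collector
```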
12-28-2015 02:45 PM

@Pardeep is correct. You can only use a local SQL Server account for a Sqoop import. You may want to secure the connection by using the --password-file option: https://sqoop.apache.org/docs/1.4.5/SqoopUserGuide.html#_connecting_to_a_database_server
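A minimal sketch of that setup, with made-up host, table, and path names; it assumes the Microsoft SQL Server JDBC driver jar is already in Sqoop's lib directory:

```bash
# Store the local SQL Server account's password in HDFS, readable only by the owner.
echo -n 'S3cretPassw0rd' | hdfs dfs -put - /user/etl/.sqlserver.password
hdfs dfs -chmod 400 /user/etl/.sqlserver.password

# Import using the password file instead of putting the password on the command line.
sqoop import \
  --connect "jdbc:sqlserver://sqlhost.example.com:1433;databaseName=sales" \
  --username sqoop_user \
  --password-file /user/etl/.sqlserver.password \
  --table Orders \
  --target-dir /data/sales/orders \
  -m 4
```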
12-16-2015 07:01 PM | 1 Kudo

@Mamta Chawla will Hive's CTE syntax help in your case? https://cwiki.apache.org/confluence/display/Hive/C...
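For anyone finding this later, a small made-up example of the WITH ... AS form, run through the Hive CLI:

```bash
# Illustration of Hive's CTE (WITH ... AS) syntax; table and column names are invented.
hive -e "
WITH recent_orders AS (
  SELECT customer_id, order_total
  FROM orders
  WHERE order_date >= '2015-12-01'
)
SELECT customer_id, SUM(order_total) AS total_spend
FROM recent_orders
GROUP BY customer_id;
"
```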
12-15-2015 02:07 PM | 1 Kudo

Notifying @Alan Gates
12-15-2015 01:45 PM | 1 Kudo

@PRADEEP /usr/hdp/current contains symlinks to the versioned directories. As always, verify that the correct permissions exist on those directories.
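A quick way to eyeball both points:

```bash
# /usr/hdp/current should contain symlinks that resolve into the versioned
# install directory (e.g. /usr/hdp/<version>/...).
ls -l /usr/hdp/current

# Spot-check ownership and permissions on the directories the symlinks point to.
ls -ld /usr/hdp/*/hadoop* 2>/dev/null
```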
12-15-2015 01:26 PM | 1 Kudo

You can also try using the browser-based shell connection: http://sandbox.hortonworks.com:4200/
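If the browser shell doesn't load, SSH works too; 2222 is the forwarded guest SSH port used by the sandbox images, and the host name below assumes the default sandbox hosts entry, so substitute whatever address your VM is reachable on:

```bash
# Connect to the sandbox VM over the forwarded SSH port instead of the web shell.
ssh root@sandbox.hortonworks.com -p 2222
```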