Member since: 01-09-2019

401 Posts | 163 Kudos Received | 80 Solutions

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2596 | 06-21-2017 03:53 PM |
| | 4294 | 03-14-2017 01:24 PM |
| | 2389 | 01-25-2017 03:36 PM |
| | 3840 | 12-20-2016 06:19 PM |
| | 2101 | 12-14-2016 05:24 PM |
			
    
	
		
		
05-26-2016 12:39 PM

A better option is to check why it is in safe mode in the first place. The most likely cause is missing blocks and DataNodes being down, but there is a reason it entered safe mode.
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-26-2016 12:38 PM

1 Kudo

@Venkadesh Sivalingam If the YARN local directory is the one with the space issue, as indicated, then this is not about YARN container logs but about YARN local data. That can be a valid state if the job is still running. If the job is not running, crashed jobs can leave YARN local data behind. To clean it up, stop the NodeManager on that node (when no containers are running there) and delete everything under the /yarn/local directories.

On another note, there is a warning about permissions on /app-logs. Please correct the file permissions (though I believe this is not causing an issue right now).
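A rough sketch of that cleanup, assuming the standard HDP daemon-script location and the /yarn/local path from the question; confirm yarn.nodemanager.local-dirs in yarn-site.xml before deleting anything:

```shell
# Sketch only: the daemon-script path and the local-dir value are
# assumptions -- check yarn.nodemanager.local-dirs in yarn-site.xml first.
YARN_LOCAL_DIR=/yarn/local

# Stop the NodeManager on this node (or use Ambari) so no containers
# are writing to the local dirs while they are cleaned.
su - yarn -c "/usr/hdp/current/hadoop-yarn-nodemanager/sbin/yarn-daemon.sh stop nodemanager"

# The :? guard aborts if the variable is unset or empty, so this can
# never expand into `rm -rf /*`.
rm -rf "${YARN_LOCAL_DIR:?}"/*

su - yarn -c "/usr/hdp/current/hadoop-yarn-nodemanager/sbin/yarn-daemon.sh start nodemanager"
```

The `${VAR:?}` expansion is the important safety detail: with a plain `$YARN_LOCAL_DIR`, an unset variable would silently turn the command into `rm -rf /*`.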
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-26-2016 12:28 PM

How long has it been in this state? It looks like the NameNode is up but still in safe mode (it stays in safe mode until the threshold of blocks has been reported). Take a look at the NameNode UI (http://<NN Node>:50070) to see why it is in safe mode; it reports how many blocks have been reported and how many DataNodes have checked in.
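Alongside the UI, the safe-mode state can be queried from the command line. A small sketch, assuming the standard `hdfs dfsadmin` CLI; the parsing helper is illustrative, not part of Hadoop:

```shell
# Reduce dfsadmin's "Safe mode is ON ..." line to just ON or OFF.
safemode_state() {
  awk '/^Safe mode is/ { gsub(/[^A-Z]/, "", $4); print $4 }'
}

# On a cluster you would pipe the real command:
#   hdfs dfsadmin -safemode get | safemode_state
# Simulated here with the usual output format:
echo "Safe mode is ON" | safemode_state   # prints ON
```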
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-26-2016 06:24 AM

1 Kudo

You are going to use the hive account to run the Spark Thrift Server. So, if it is a manual install, then

./sbin/start-thriftserver.sh --master yarn-client --executor-memory 512m --hiveconf hive.server2.thrift.port=10015

will be run as user hive (with su hive) instead of user spark in a secure setup. Similarly, /var/run/spark and /var/log/spark should be readable and writable by hive; just being able to see their contents as user hive is not enough, you need to be able to write to those folders. One easy way is to give 77x permissions on these folders: since spark:hadoop is the owner:group and hive belongs to the group hadoop, hive gets write access with this setup.
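The 77x idea can be sanity-checked on a scratch directory; on the cluster the directories would be /var/run/spark and /var/log/spark, owned spark:hadoop, with hive's membership in group hadoop being what makes group-write sufficient:

```shell
# Demonstrate the 775 pattern on a throwaway directory; the spark:hadoop
# ownership and hive's group membership are the cluster-side assumptions.
dir=$(mktemp -d)
chmod 775 "$dir"

# rwx for owner and group, r-x for others: any member of the owning
# group (e.g. hive in group hadoop) can now create files here.
ls -ld "$dir"
```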
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-26-2016 05:47 AM

If you are using the Sandbox or HDP, try copying the jar to /usr/hdp/<version>/sqoop/lib/. Let us know how jtds works once you have tested with it.
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-26-2016 04:44 AM

You can get a sandbox from http://hortonworks.com/downloads/#sandbox, but you will need at least 8 GB for the sandbox itself, so make sure you are on a machine with 12-16 GB of RAM. If you don't have a machine with that much RAM, Azure/AWS is your option.

For any further questions, please open a new thread per question, so this doesn't turn into one long thread of questions and answers.
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-25-2016 10:35 PM

Answered your other question, which asked about upgrading the sandbox to 2.4.2.
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-25-2016 10:33 PM

1 Kudo

There is no upgrade path for HDP sandboxes. If you want to quickly create an HDP 2.4.2 cluster and have 16 GB of RAM on your computer, I suggest following the steps at https://cwiki.apache.org/confluence/display/AMBARI/Quick+Start+Guide, which covers creating a 3-node cluster. You will need VirtualBox and Vagrant. While that guide is aimed at Ambari development and testing, you can use the same steps to create a cluster with Ambari 2.2.2 and HDP 2.4.2.
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-25-2016 10:30 PM

livy 0.2.0 is part of the HDP 2.4.2 repos. Since the sandbox is on HDP 2.4.0, you won't see livy-server in its repos.
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
05-25-2016 09:59 PM

2 Kudos

@Timothy Spann Here is an example of sqooping data in compressed ORC format (without needing an intermediate table); I see that this is not well documented:

sqoop import --connect jdbc:mysql://localhost/employees --username hive --password hive --table departments --hcatalog-database default --hcatalog-table my_table_orc --create-hcatalog-table --hcatalog-storage-stanza "stored as orcfile"

This example uses the default ORC compression. If you want snappy, create the table in advance with the table property set to snappy compression and then drop '--create-hcatalog-table'. More details are in this thread: https://community.hortonworks.com/questions/28060/can-sqoop-be-used-to-directly-import-data-into-an.html
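A hedged sketch of the snappy variant described above; the column list is a placeholder for the real schema, and the table/connection names simply mirror the example:

```shell
# 1. Create the ORC table up front with snappy compression
#    (the columns here are illustrative, not from the original thread).
hive -e 'CREATE TABLE default.my_table_orc (dept_no STRING, dept_name STRING)
         STORED AS ORC TBLPROPERTIES ("orc.compress"="SNAPPY");'

# 2. The same sqoop command as above, minus --create-hcatalog-table and
#    the storage stanza, so the data lands in the pre-created snappy table.
sqoop import --connect jdbc:mysql://localhost/employees \
  --username hive --password hive --table departments \
  --hcatalog-database default --hcatalog-table my_table_orc
```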
				
			
			
			
			
			
			
			
			
			
		 
        













