Member since 05-02-2017

- 88 Posts
- 173 Kudos Received
- 15 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 7407 | 09-27-2017 04:21 PM |
|  | 3393 | 08-17-2017 06:20 PM |
|  | 3083 | 08-17-2017 05:18 PM |
|  | 3665 | 08-11-2017 04:12 PM |
|  | 5154 | 08-08-2017 12:43 AM |
05-26-2017 09:36 AM · 4 Kudos

@Hugo Felix Is your JobHistory server running in Ambari or not? Please check and confirm. Also share the contents of your job.properties file.
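For reference, a minimal job.properties for an Oozie workflow might look like the sketch below; every host name, port, and path here is hypothetical and must be adjusted to the actual cluster:

```properties
# Hypothetical cluster endpoints; replace with your NameNode/ResourceManager
nameNode=hdfs://sandbox.hortonworks.com:8020
jobTracker=sandbox.hortonworks.com:8050
queueName=default
oozie.use.system.libpath=true
# HDFS directory containing workflow.xml (path is made up for illustration)
oozie.wf.application.path=${nameNode}/user/oozie/workflows/myapp
```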
						
					
05-26-2017 09:25 AM · 3 Kudos

@Sadegh Can you try importing it into VirtualBox (.ova file format)?
						
					
05-26-2017 07:15 AM · 2 Kudos

@Sadegh You have installed the sandbox on a Windows 10 machine; that is not the same as installing Hadoop directly on your system. That is why, whenever you want to execute hadoop commands, you need to SSH into the sandbox (for example from PuTTY) and then run the hadoop commands in that shell.
						
					
05-26-2017 06:29 AM · 1 Kudo

@Jay SenSharma I am looking for a use case on this.
						
					
05-26-2017 05:21 AM · 3 Kudos

I have the Storm service installed and have added two Nimbus hosts. How can I verify Nimbus high availability? What steps should be performed?
						
					
- Labels:
  - Apache Storm
    
	
		
		
05-26-2017 04:22 AM

@Azad Kumar

Should I execute this command in the Sandbox terminal? Yes.

Where should I upload the file test.sh? To the /tmp folder.

How will I copy test.sh from my local machine? With:

hadoop fs -copyFromLocal <local path of test.sh file> /tmp/
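As a sketch of the steps above: create a trivial test.sh locally, sanity-check it, then copy it to HDFS. The script contents are invented for the demo, and the hadoop command itself must be run where the Hadoop client is installed:

```shell
# Create a trivial test.sh locally (the script body is invented for the demo)
cat > /tmp/test.sh <<'EOF'
#!/bin/sh
echo "hello from test.sh"
EOF
chmod +x /tmp/test.sh

# Sanity-check it locally before pushing it anywhere
sh /tmp/test.sh

# Then, on the sandbox, copy it into HDFS (requires the hadoop client):
# hadoop fs -copyFromLocal /tmp/test.sh /tmp/
```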
						
					
05-26-2017 03:09 AM · 4 Kudos

@Stinger You can check all query logs of a Hive session in the log file under /tmp:

/tmp/<Username>/hive.log

Please check.
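To illustrate, here is a rough sketch of pulling executed queries out of such a log. The log location follows the path above, but the log line format is an assumption, so this fabricates a tiny sample log for the grep to work on; adjust the pattern to what your hive.log actually contains:

```shell
# Per-user Hive session log, as described above
HIVE_LOG="/tmp/$(whoami)/hive.log"

# Fabricate a tiny sample log so the grep below has input (lines are made up)
mkdir -p "$(dirname "$HIVE_LOG")"
cat > "$HIVE_LOG" <<'EOF'
2017-05-26 03:01:12 INFO  ql.Driver: Executing command(queryId=abc): SELECT * FROM t1
2017-05-26 03:01:15 INFO  exec.Task: done
2017-05-26 03:02:01 INFO  ql.Driver: Executing command(queryId=def): INSERT INTO t2 VALUES (1)
EOF

# Pull out just the executed queries (the marker string is an assumption)
grep 'Executing command' "$HIVE_LOG"
```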
						
					
05-26-2017 03:00 AM · 4 Kudos

@PJ Yes, of course. You can go to the ResourceManager hosts and delete the hostname from the yarn.exclude file. You do not need to restart the YARN service after that; just execute the command below:

yarn rmadmin -refreshNodes

Check whether that works.
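A local sketch of the exclude-file edit might look like this; the file path and hostnames are made up (the real file lives wherever your ResourceManager's exclude-path setting points), and the final refresh must be run on the ResourceManager host:

```shell
# Stand-in exclude file with two decommissioned hosts (hypothetical names)
EXCLUDE=/tmp/yarn.exclude
printf 'node1.example.com\nnode2.example.com\n' > "$EXCLUDE"

# Recommission node1 by removing it from the exclude list
grep -v '^node1\.example\.com$' "$EXCLUDE" > "$EXCLUDE.tmp" && mv "$EXCLUDE.tmp" "$EXCLUDE"
cat "$EXCLUDE"

# Then, on the ResourceManager host, re-read the include/exclude lists:
# yarn rmadmin -refreshNodes
```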
						
					
05-26-2017 02:52 AM · 3 Kudos

@Azad Kumar There are three ways:

1. hadoop fs -cat /tmp/test.sh | exec sh
2. Install the HDP NFS gateway and mount the HDFS directory on the local file system, from where you can execute your script: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_hdfs_nfs_gateway/content/user-guide-hdfs-nfs-instructions.html
3. Write an Oozie shell workflow and call your .sh file in HDFS inside the workflow: http://rogerhosto.com/apache-oozie-shell-script-example/
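Option 1 works by piping the script's text into a shell. The same pattern can be sketched locally with plain cat standing in for hadoop fs -cat, since both just stream the file's contents to stdout; the script path and contents here are invented for the demo:

```shell
# A throwaway script standing in for the one stored in HDFS
cat > /tmp/demo.sh <<'EOF'
echo "ran from a pipe"
EOF

# Locally this mirrors: hadoop fs -cat /tmp/test.sh | exec sh
cat /tmp/demo.sh | sh
```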
						
					
05-26-2017 02:49 AM · 5 Kudos

@punit Please use one of the URLs below for the JDBC connection to Hive:

jdbc:hive2://sandbox.hortonworks.com:10000/default

or

jdbc:hive2://sandbox.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
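For instance, the first URL could be used from the command line roughly as follows; the user name and password are placeholders, and beeline must be run where the Hive client is installed:

```shell
# Build the direct HiveServer2 JDBC URL from the sandbox host name
HOST=sandbox.hortonworks.com
URL="jdbc:hive2://${HOST}:10000/default"
echo "$URL"

# On the sandbox, connect with beeline (credentials are placeholders):
# beeline -u "$URL" -n <user> -p <password> -e 'show databases;'
```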
						
					