Member since 09-18-2015

Posts: 216
Kudos Received: 208
Solutions: 49
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 1355 | 09-13-2017 06:04 AM |
|  | 2606 | 06-27-2017 06:31 PM |
|  | 2501 | 06-27-2017 06:27 PM |
|  | 10374 | 11-04-2016 08:02 PM |
|  | 9861 | 05-25-2016 03:42 PM |
12-28-2015 09:10 PM
@Gangadhar Kadam For each input split or file block, one map task is initiated. It doesn't depend on the number of records (K,V pairs) in that block or input split. So if you have m blocks or input splits, at least m map tasks will be initiated; it can be more than m if speculative execution is turned on. With respect to your example, if your 64 MB file has 1000 records and occupies one block, only one map task will be triggered.
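For instance, you can count a file's HDFS blocks (and therefore the number of map tasks to expect) with fsck; the path below is illustrative:

```bash
# List the file's HDFS blocks; each block/split normally becomes one map task
hdfs fsck /user/test/data.csv -files -blocks
```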
12-28-2015 07:12 PM
@Sam Mingolelli I am not sure there was ever a single CLI command to stop the entire HDP stack, but you can use the Ambari REST API to start or stop all services. Refer to the following for the same: https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=41812517
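For example, along the lines of the wiki page above (host, cluster name, and credentials below are placeholders):

```bash
# Stop all services (target state INSTALLED)
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop All Services"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://ambari-host:8080/api/v1/clusters/mycluster/services

# Start all services (target state STARTED)
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  http://ambari-host:8080/api/v1/clusters/mycluster/services
```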
12-28-2015 06:55 PM
@kkane I just installed a couple of clusters last week and Ambari created the schema in the MySQL database I specified, so there are two likely scenarios: 1. If this is a fresh cluster install, there may be a permission issue on the database; please check that. 2. If this is an upgraded cluster, you need to upgrade the Hive metastore schema as well.
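A quick sketch for both scenarios (database name, host, and credentials below are illustrative):

```bash
# Scenario 1: confirm the hive user can connect to and use the metastore database
mysql -u hive -p -h db-host -e 'USE hive; SHOW TABLES;'

# Scenario 2: upgrade the metastore schema after a cluster upgrade
schematool -dbType mysql -upgradeSchema
```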
12-28-2015 06:37 PM
So the issue was that the hive-site.xml path was not correct. Accepting this answer and closing the question.
12-28-2015 06:29 PM
@Suresh Bonam I am not exactly sure what you mean by checking the Hadoop environment in AWS; it would be the same as in any environment. You can log into the Ambari Server UI and see whether all services are up and running, then run a service check for each service from the UI: Ambari Server UI --> Services --> Service check. For additional verification, you can run smoke tests for each of the services.
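For example, a few manual smoke tests you could run from a cluster node (paths below are illustrative):

```bash
# HDFS: write and read back a small test file
hdfs dfs -mkdir -p /tmp/smoketest
hdfs dfs -put /etc/hosts /tmp/smoketest/
hdfs dfs -cat /tmp/smoketest/hosts

# YARN: confirm the ResourceManager responds
yarn application -list

# Hive: confirm the metastore responds
hive -e 'show databases;'
```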
12-28-2015 06:23 PM
@Peter Lasne All service accounts (hive, hdfs, mapred, etc.) are managed accounts: effectively password-less, with auto-generated, auto-rotated hashed passwords. You can log in to these users from a sudo-enabled account or as root with "su - hive" or "sudo su - hive", and so on. On the other hand, to bring up the Hive shell you don't need to log in as hive, as @Deepesh said; you can start it by typing hive at the terminal prompt.
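For example (assuming a sudo-enabled login; hive is a standard service account on HDP):

```bash
# Switch to the hive service account; no password prompt, since the account is managed
sudo su - hive

# Or simply start the Hive shell as your current user
hive
```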
12-28-2015 06:12 PM
@Peter Lasne As you can see, only the admin user has write access, which is causing this issue:

/tmp/admin/data/trucks.csv":admin:hdfs:drwxr-xr-x

Please give the hive user write access here and that should fix it. I would usually recommend having rwxrwxrwx on /tmp.
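A minimal sketch of one way to open this up (the path comes from the error above; adjust as needed):

```bash
# Give everyone write access on the staging directory from the error
hdfs dfs -chmod -R 777 /tmp/admin

# Verify the new permissions
hdfs dfs -ls /tmp/admin/data
```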
12-28-2015 02:36 PM
@Raja Sekhar Chintalapati You need an account with SQL Server Authentication to sqoop from SQL Server to Hadoop. I don't think Sqoop works with a Windows Authentication account in SQL Server; it definitely didn't use to, though I haven't tried it in the past 6 months. Please refer to: http://hortonworks.com/hadoop-tutorial/import-microsoft-sql-server-hortonworks-sandbox-using-sqoop/
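A minimal sketch of such an import (connection values and table name are placeholders; the SQL Server JDBC driver jar must be in Sqoop's lib directory):

```bash
sqoop import \
  --connect "jdbc:sqlserver://sqlserver-host:1433;databaseName=mydb" \
  --username sqoop_user \
  --password '********' \
  --table customers \
  --target-dir /user/sqoop/customers
```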
12-23-2015 09:58 AM

4 Kudos
@Suresh Bonam There are two ways to do this that I can think of at the moment.

1. You can write a couple of Pig statements to accomplish it.

2. You can try a Hive query like the one below. (I wouldn't normally recommend this pattern for performance reasons, since Hive will first compute a full cartesian product for this query and then filter; but since one side of the join has only one row, that's not an issue here.)

select emp1.ename, emp1.hiredate
from emp emp1
join (select hiredate from emp where emp.ename = 'KING') emp2
where emp1.hiredate > emp2.hiredate;
12-23-2015 09:43 AM

1 Kudo
@Ali Bajwa As discussed, we have seen SSL issues when OpenJDK 1.7.0 is used with Ambari Server. Please refer to the following for the same:

https://community.hortonworks.com/questions/145/op...

Check the values in the ambari.properties file:

java.home=/usr/jdk64/jdk1.8.0_40 (replace with the Oracle JDK version set up for your environment)
jdk.name=jdk-8u40-linux-x64.tar.gz (replace with the Oracle JDK version set up for your environment)

If there are multiple JDKs installed, try switching the JDK:

ambari-server setup -j <jdk path>

http://docs.hortonworks.com/HDPDocuments/Ambari-2....
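For example, a quick way to check and switch the JDK (the properties file path below is the default Ambari Server location; the JDK path is illustrative):

```bash
# Inspect the JDK Ambari Server is currently configured with
grep -E '^(java\.home|jdk\.name)' /etc/ambari-server/conf/ambari.properties

# Re-run setup against an Oracle JDK, then restart
ambari-server setup -j /usr/jdk64/jdk1.8.0_40
ambari-server restart
```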