Member since: 12-14-2016

Posts: 58 | Kudos Received: 1 | Solutions: 5

        My Accepted Solutions
| Title | Views | Posted | 
|---|---|---|
|  | 1874 | 04-19-2017 05:49 PM |
|  | 1566 | 04-19-2017 11:43 AM |
|  | 2426 | 04-19-2017 09:07 AM |
|  | 4187 | 03-26-2017 04:20 PM |
|  | 5784 | 02-03-2017 04:44 AM |
			
    
	
		
		
Posted on 02-18-2021 07:09 AM

Hi @ramcharantej,

Hadoop needs read permission on the file to verify a user's login information. Try changing the file ownership to the hive user with the hadoop group and it should work afterwards. You can use the command below for the user credential file:

    sudo chown hive:hadoop /etc/shadow

In your case, additional files such as the sshd files should also be owned by root. Please let me know if you still face any problem with it.

Regards,
Dhirendra
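
A minimal sketch of that change, assuming the service account is hive and its group is hadoop; the file mode and names are assumptions, so adjust them for your environment and keep the security implications of loosening access to /etc/shadow in mind.

```bash
# Sketch only: give the hive user / hadoop group ownership of the credential file,
# as suggested above, and make sure the group can read it. All names are assumptions.
ls -l /etc/shadow                      # check the current owner, group and mode
sudo chown hive:hadoop /etc/shadow     # hive user, hadoop group
sudo chmod 640 /etc/shadow             # owner read/write, group read
ls -l /etc/shadow                      # confirm the change took effect
```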
						
					
Posted on 04-25-2017 03:30 PM (1 Kudo)

Thanks all for your responses. I reassigned ownership once again and it works now:

    hdfs dfs -chown -R admin:hadoop /user/admin
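
If it helps anyone repeating this, a quick way to confirm the ownership change (same /user/admin path as above):

```bash
# Verify the recursive chown: the owner/group columns should now read admin hadoop
hdfs dfs -ls -d /user/admin
hdfs dfs -ls -R /user/admin | head     # spot-check a few child entries as well
```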
						
					
Posted on 06-09-2017 10:55 AM

Also, this instance is not available in all Regions. Change the Region and give it a try.
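
If this refers to an EC2 instance type (an assumption on my part; the type below is only a placeholder), a recent AWS CLI can list the Regions that actually offer it:

```bash
# Sketch: print every Region that offers a given instance type (m4.xlarge is a placeholder).
for region in $(aws ec2 describe-regions --query 'Regions[].RegionName' --output text); do
  found=$(aws ec2 describe-instance-type-offerings \
            --region "$region" \
            --location-type region \
            --filters "Name=instance-type,Values=m4.xlarge" \
            --query 'InstanceTypeOfferings[].InstanceType' \
            --output text)
  [ -n "$found" ] && echo "m4.xlarge is offered in $region"
done
```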
						
					
Posted on 01-11-2018 05:52 PM

							 
In case someone faces the same problem: we solved this by making the table internal, keeping the TextFile format, and storing the data under the default Hive directory. The table definition currently looks like this:

    CREATE TABLE test1 (c1 INT, c2 INT)
    CLUSTERED BY (c1) SORTED BY (c1) INTO 4 BUCKETS
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;
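
For anyone reproducing this, a sketch of creating and loading the table; the beeline URL and source_table name are placeholders, and a bucketed table should be filled with INSERT ... SELECT (not LOAD DATA) so Hive can distribute rows across the buckets:

```bash
# Sketch: create the internal, bucketed TextFile table and populate it.
# The JDBC URL and source_table are assumptions. On Hive 1.x, also run
# "SET hive.enforce.bucketing=true;" before the INSERT (Hive 2.x enforces it by default).
beeline -u "jdbc:hive2://localhost:10000/default" -e "
  CREATE TABLE IF NOT EXISTS test1 (c1 INT, c2 INT)
  CLUSTERED BY (c1) SORTED BY (c1) INTO 4 BUCKETS
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  STORED AS TEXTFILE;
  INSERT INTO TABLE test1 SELECT c1, c2 FROM source_table;
"
```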
						
					
Posted on 04-13-2017 03:05 PM

@Michael Young Thanks! That worked like a charm. I still have no idea why it doesn't let me upload using the HDFS UI, so if you know why, I'd love to hear it.
						
					
Posted on 03-26-2017 04:20 PM

Thanks for the replies, folks. I have found the issue! When we import data from legacy DB servers using Spark, Hive staging files are created during the Spark execution in the target location where the data resides. When we export this data to S3 using distcp, those Hive staging files also move to the bucket, so when we query the table with Hive it appears to check all of those staging files before returning the output. The number of splits also matters, and there were a lot of them; I merged the splits together to get fewer mappers and better performance, which is now achieved. I get the count of the 3-million-record table in a fraction of a second!
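
A sketch of both parts of that fix, assuming Hadoop 2.8+ for distcp's -filters option; the HDFS path, S3 bucket, table names, and split size below are placeholders:

```bash
# 1) Keep leftover .hive-staging directories out of the S3 copy (paths are placeholders).
echo '.*\.hive-staging.*' > /tmp/distcp-exclude.txt
hadoop distcp -filters /tmp/distcp-exclude.txt \
    hdfs:///apps/hive/warehouse/mydb.db/mytable \
    s3a://my-bucket/warehouse/mydb/mytable

# 2) Combine the many small files into fewer, larger input splits at query time,
#    so the count runs with fewer mappers (the split size is a placeholder).
hive -e "
  SET hive.input.format=org.apache.hadoop.hive.ql.io.CombineHiveInputFormat;
  SET mapreduce.input.fileinputformat.split.maxsize=268435456;
  SELECT COUNT(*) FROM mydb.mytable;
"
```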
						
					
Posted on 03-20-2017 10:24 AM

Thanks Namit, this worked for me in my Dev environment. I will try it with the next change on Prod too. Thanks.
						
					
Posted on 04-27-2017 11:12 AM

Thanks Sanjeev, we had earlier symlinked hdp/ under the /usr directory to /opt/usr/hdp successfully. We are doing well now!

Cheers,
Ram
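
For anyone landing here later, a sketch of that symlink approach; the exact mount points are assumptions, and HDP services should be stopped before moving the tree:

```bash
# Sketch: relocate the HDP install to /opt and symlink it back so existing
# references to /usr/hdp keep resolving. Stop HDP services before the move.
sudo mkdir -p /opt/usr
sudo mv /usr/hdp /opt/usr/hdp
sudo ln -s /opt/usr/hdp /usr/hdp
ls -ld /usr/hdp        # should show: /usr/hdp -> /opt/usr/hdp
```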
						
					
Posted on 02-05-2018 04:18 AM

Thanks for the feedback.
						
					
Posted on 10-26-2017 06:47 AM

Thanks for your information. I think

    virtualenv venv. ./venv/bin/activate

should be

    virtualenv venv
    . ./venv/bin/activate
						
					