Member since 05-16-2016
      
785 Posts
114 Kudos Received
39 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2324 | 06-12-2019 09:27 AM |
| | 3568 | 05-27-2019 08:29 AM |
| | 5718 | 05-27-2018 08:49 AM |
| | 5231 | 05-05-2018 10:47 PM |
| | 3110 | 05-05-2018 07:32 AM |
Posted 09-03-2019 08:16 PM

Could you share the error? Do you have the Sqoop client installed on the node? What does your MySQL cnf file look like? If it contains skip-networking, comment that line out and restart MySQL. I assume this is your PoC box.
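A minimal sketch of that change, assuming the config file lives at /etc/my.cnf and the service is named mysqld (both vary by distribution):

```sh
# Comment out skip-networking so MySQL accepts TCP connections,
# then restart the service to pick up the change.
sudo sed -i 's/^skip-networking/# skip-networking/' /etc/my.cnf
sudo service mysqld restart
```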
Posted 09-03-2019 11:09 AM

Hi,

To view the Spark logs of a completed application, run the command below:

yarn logs -applicationId application_xxxxxxxxxxxxx_yyyyyy -appOwner <userowner> > application_xxxxxxxxxxxxx_yyyyyy.log

Thanks,
AKR
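If you don't have the application ID at hand, one way to look it up (assuming the standard YARN CLI is available) is:

```sh
# List finished YARN applications and pick out the Spark run in question.
yarn application -list -appStates FINISHED
```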
Posted 06-12-2019 11:21 PM

Hi,

A couple of questions:

1. Have you checked the HS2 log to see whether it complained about anything, or whether beeline reached HS2 at all? I suspect not, but I just want to be sure.

2. Based on the code here:
https://github.com/cloudera/hive/blob/cdh6.1.0/beeline/src/java/org/apache/hive/beeline/BeeLine.java#L1035-L1048

it looks like beeline failed to get the connection string. Have you tried quoting the connection string, just in case?

beeline -u 'jdbc:hive2://hostname.domain.dom:10000'

Cheers,
Eric
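Quoting helps because JDBC URLs often carry session parameters separated by semicolons, which an unquoted shell command would treat as command separators. A hypothetical illustration (the ssl parameter here is purely for show):

```sh
# Without the quotes, the shell would split this command at the semicolon.
beeline -u 'jdbc:hive2://hostname.domain.dom:10000/default;ssl=true'
```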
Posted 06-12-2019 11:31 AM

sudo cp /home/cloudera/Downloads/java-json.jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/

The above solution worked for me too. Thank you.
Posted 06-06-2019 11:09 PM

I had added the above values, and they were causing HTTPS to shut down. After deleting those values, it started and is working fine now.

Thanks @Harsh J for your reply.
Posted 06-05-2019 12:15 AM

Could you let me know what issue you are facing? What is the error?
Posted 05-20-2019 09:04 AM

This is a quite common issue while installing CDH, but it is easy to resolve. It is an SSH issue: you need to edit the /etc/ssh/sshd_config file and change the parameter below:

PermitRootLogin yes

Then restart the SSH service: sudo service ssh restart

Now select root to log in and click Continue. Your issue will be resolved.

Thanks,
Solomonchinni
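A minimal sketch of those steps as shell commands, assuming a Debian/Ubuntu-style service name (on RHEL/CentOS the service is usually sshd):

```sh
# Allow root login over SSH, then restart the service to pick up the change.
sudo sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin yes/' /etc/ssh/sshd_config
sudo service ssh restart
```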
Posted 05-15-2019 07:25 AM

@GaryS I reached out to the sales team yesterday, and someone should be contacting you. Let me know if you haven't heard anything by tomorrow.
Posted 05-10-2019 02:02 PM

By default, Hive does not write Parquet files with compression enabled.

https://issues.apache.org/jira/browse/HIVE-11912

However, writing into a Parquet table with Impala creates files with internal Snappy compression by default.
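A hypothetical illustration of overriding the Hive default via the parquet.compression table property discussed in HIVE-11912 (the table names are made up, and this assumes a Hive version that includes that change):

```sh
# Create a Snappy-compressed Parquet table from Hive via beeline.
beeline -u 'jdbc:hive2://hostname.domain.dom:10000' -e "
CREATE TABLE sales_parquet
STORED AS PARQUET
TBLPROPERTIES ('parquet.compression'='SNAPPY')
AS SELECT * FROM sales_text;
"
```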