Member since 05-16-2016

785 Posts | 114 Kudos Received | 39 Solutions

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
| | 2327 | 06-12-2019 09:27 AM |
| | 3578 | 05-27-2019 08:29 AM |
| | 5724 | 05-27-2018 08:49 AM |
| | 5242 | 05-05-2018 10:47 PM |
| | 3113 | 05-05-2018 07:32 AM |
			
    
	
		
		
04-24-2018 04:03 AM

1. You can go ahead and create the external table directly in Impala; I don't see any issue there.
2. Use an external table when multiple client tools need to share a central copy of the data; you have to decide whether the table's data will also be read directly from HDFS by other external programs, for example Pig.
3. That depends entirely on future requirements, but people mostly prefer timestamps over string format when it comes to dates.

Note: loading data into Parquet tables is a memory-intensive operation, so keep an eye on it.
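As a rough sketch of points 1 and 3 (the table name, columns, and HDFS path below are made-up placeholders, not from your environment), an external Parquet table with a proper TIMESTAMP column could look like this in Impala:

```sql
-- Hypothetical example: names and location are placeholders.
CREATE EXTERNAL TABLE IF NOT EXISTS sales_events (
  event_id   BIGINT,
  event_time TIMESTAMP,   -- prefer TIMESTAMP over STRING for dates (point 3)
  amount     DOUBLE
)
STORED AS PARQUET
LOCATION '/data/external/sales_events';  -- data stays in place; DROP TABLE will not remove the files
```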
						
					
04-15-2018 07:43 PM

Does your script have this line inserted: set hive.variable.substitute=true; ?
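For reference, a minimal sketch of what variable substitution looks like once that setting is on (the variable, database, and table names are made-up placeholders):

```sql
-- Hypothetical example: run_date, my_db and events are placeholders.
set hive.variable.substitute=true;
set hiveconf:run_date=2018-04-15;
SELECT *
FROM my_db.events
WHERE event_date = '${hiveconf:run_date}';
```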
						
					
04-15-2018 07:29 PM
1 Kudo

@Aedulla This is the airport data set; you can use it from the git repository: https://github.com/markgrover/cloudcon-hive. Hope that suffices.
						
					
04-06-2018 09:16 PM

You can launch the cluster in two ways: either by running the command below in a terminal

sudo /home/cloudera/cloudera-manager --force --express

or by using the VM desktop icon. Once you land on the cluster home page, click the cluster drop-down arrow and you will find a Start option. Allocate an adequate amount of resources for a healthy cluster. Let me know if that helps.

Thanks
						
					
04-02-2018 05:31 AM

What version of the kernel are you using? Did this happen after a reboot? Do you have permission to change the value?
						
					
04-01-2018 09:06 PM

For monitoring purposes, you can hit the URL below to see currently running jobs:

http://spark_driver_host:4040

To see completed jobs, use the Spark History Server web UI. Let me know if you need any more information.

Reference: https://www.cloudera.com/documentation/enterprise/5-9-x/topics/operation_spark_applications.html#concept_uhd_zpc_3w__section_bh5_xnr_yv
						
					
04-01-2018 09:02 PM

Since it's a generic exception, narrowing down the issue could be a little challenging, so I would start by making sure the parameters below are set appropriately for your environment on HiveServer2:

hive.server2.session.check.interval
hive.server2.idle.operation.timeout
hive.server2.idle.session.timeout

I believe you are running a long-running query or job that fails with the error above? Could you let me know whether you have multiple HiveServer2 instances or just one in your environment?
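A minimal sketch of those properties with illustrative values only (in practice they are usually configured in hive-site.xml or the HiveServer2 safety valve in Cloudera Manager and need a HiveServer2 restart; the SET syntax here is just for readability):

```sql
-- Illustrative values only; tune for your environment.
SET hive.server2.session.check.interval=6h;   -- how often HS2 scans for idle sessions/operations
SET hive.server2.idle.operation.timeout=5d;   -- idle operations older than this are closed
SET hive.server2.idle.session.timeout=7d;     -- idle sessions older than this are closed
```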
						
					
03-24-2018 08:46 PM

Do you have any files under this directory: /var/lib/cloudera-scm-headlamp/cloudera-scm-headlamp? How many clusters do you have?
						
					
03-24-2018 08:06 PM

To add a point to @GeKas: you can see the permission denied in your log trace (for future reference):

AccessControlException: Permission denied: user=hive, access=EXECUTE, inode="/tmp":hdfs:supergroup:d-wx
						
					