Member since 04-25-2016

- 579 Posts
- 609 Kudos Received
- 111 Solutions

        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2925 | 02-12-2020 03:17 PM |
| | 2136 | 08-10-2017 09:42 AM |
| | 12471 | 07-28-2017 03:57 AM |
| | 3410 | 07-19-2017 02:43 AM |
| | 2522 | 07-13-2017 11:42 AM |
12-26-2016 10:20 AM

@radhika mantri instead of reading the "#" keys from the properties file, why don't you put a null check in your producer or consumer and set the value there, like this:

    if (properties.getProperty("#google") == null) {
        properties.setProperty("#google", "google");
    }
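A self-contained sketch of that pattern using java.util.Properties (the "#google" key is taken from the snippet above; normally the Properties object would be loaded from your file first, and `getProperty(key, defaultValue)` achieves the same lookup-with-fallback in a single call):

```java
import java.util.Properties;

public class PropsDefault {
    // Set a fallback value only when the key was missing from the loaded file.
    static Properties withDefault(Properties properties, String key, String fallback) {
        if (properties.getProperty(key) == null) {
            properties.setProperty(key, fallback);
        }
        return properties;
    }

    public static void main(String[] args) {
        Properties properties = new Properties(); // normally loaded from the properties file
        withDefault(properties, "#google", "google");
        System.out.println(properties.getProperty("#google")); // prints "google"
    }
}
```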
12-26-2016 07:45 AM (1 Kudo)

@Bramantya Anggriawan ideally you should install the Kafka broker on the nodes where the logs are to be collected. The topic partition is the unit of parallelism in Kafka: on both the producer and the broker side, writes to different partitions can be done fully in parallel. If you don't have many topics to produce data to, then 1-2 Kafka servers can be enough.
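As a minimal sketch of why partitions give parallelism: a keyed record is assigned to a partition by hashing its key, so different keys spread across partitions (and thus across parallel writers) while the same key always lands on the same partition, preserving per-key ordering. The hash below is only a stand-in for illustration (Kafka's actual default partitioner uses murmur2 on the serialized key):

```java
public class PartitionSketch {
    // Stand-in for Kafka's default partitioner: map a key to one of
    // numPartitions partitions. String.hashCode() is illustrative only;
    // Kafka itself hashes the serialized key bytes with murmur2.
    static int partitionFor(String key, int numPartitions) {
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int numPartitions = 3;
        for (String key : new String[]{"host-a", "host-b", "host-c", "host-a"}) {
            // The same key always maps to the same partition, so records
            // from one log host stay ordered relative to each other.
            System.out.println(key + " -> partition " + partitionFor(key, numPartitions));
        }
    }
}
```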
12-25-2016 04:24 PM

@jiang zhixing with the logs "Unregistering application from RM, exitStatus=SUCCEEDED, exitMessage=Session stats: submittedDAGs=0, successfulDAGs=0, failedDAGs=0, killedDAGs=0" it seems that your DAG completed successfully, so there is no need to worry. For the interrupt, the YARN community (https://issues.apache.org/jira/browse/YARN-1022) decided to change its logging level to DEBUG, but the issue is still reproducible.
12-25-2016 03:50 PM

@Aditya Mamidala could you please share your properties file?
12-25-2016 10:56 AM

The counters suggest that the DAG submission alone took 34 secs while the whole execution time was 51 secs; check whether there was a resource issue on the RM side.
12-25-2016 10:52 AM

@Huahua Wei the lsof output suggests that your ZooKeeper log location is /var/log/zookeeper/zookeeper-zookeeper-server-insightcluster132.out
12-25-2016 10:48 AM

@rama unfortunately there is no way to restore a partition in Hive.
12-25-2016 08:16 AM

Problem Description: we often need to read and write the underlying files from a user-defined reader and writer. If the custom reader and writer are written in Java or a language that runs on the JVM, we can simply add them to hive_aux_jars, or add them with the "add jar" option at the session level. But if they are written in a native language and shipped as a *.so file, we will get java.lang.UnsatisfiedLinkError. We can work around this problem by adding the library location to hive-env:

1. Open Ambari --> Hive --> Advanced --> Advanced hive-env --> hive-env template
2. Modify

       {% if sqla_db_used or lib_dir_available %}
           export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:{{jdbc_libs_dir}}"
           export JAVA_LIBRARY_PATH="$JAVA_LIBRARY_PATH:{{jdbc_libs_dir}}"
       {% endif %}
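A sketch of what the modified template block might look like, appending the directory that holds the *.so files to both library paths. The directory /usr/local/hive/native is a hypothetical example; substitute wherever your native libraries are actually installed:

```
{% if sqla_db_used or lib_dir_available %}
    export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:{{jdbc_libs_dir}}:/usr/local/hive/native"
    export JAVA_LIBRARY_PATH="$JAVA_LIBRARY_PATH:{{jdbc_libs_dir}}:/usr/local/hive/native"
{% endif %}
```

After saving the template, restart the Hive services from Ambari so the new environment takes effect.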
12-24-2016 07:11 PM

@Alicia Alicia yes, my guess is that you are running the sandbox in NAT mode. You can still access the history server web UI at http://localhost:18080 because a port forwarding rule is configured for it; can you try this and confirm? To access the web UI on port 4040 you need to configure a port forwarding rule in your VirtualBox.
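As a sketch, such a NAT port-forwarding rule can be added from the host with VBoxManage. The VM name "Hortonworks Sandbox" and the rule name "spark-ui" below are assumptions; substitute your actual VM name:

```shell
# Forward host port 4040 to guest port 4040 on the VM's first NAT adapter.
# "Hortonworks Sandbox" is a placeholder VM name; list yours with:
#   VBoxManage list vms
VBoxManage modifyvm "Hortonworks Sandbox" --natpf1 "spark-ui,tcp,,4040,,4040"
```

The same rule can also be added through the VirtualBox GUI under Settings --> Network --> Adapter (NAT) --> Port Forwarding.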