Member since 04-12-2019

6 Posts
0 Kudos Received
0 Solutions
05-17-2019 11:33 AM
Hi @dbompart,

Yes, the logic you mentioned is perfect. I have some more clarification to ask regarding containers for MapReduce and Spark:
- In MapReduce I am running a Sqoop import.
- In Spark I am running the PySpark shell on top of YARN.

Now the configuration for MapReduce is:
yarn.scheduler.maximum-allocation-mb = 36864 * 2 = 73728

My concern is how I can limit the running containers on a per-user basis (I cannot set up different queues in the Capacity Scheduler, as mentioned above).

Whenever I run the Spark application, it also runs on top of YARN:
Running containers: 3
Allocated CPU vCores: 3
Total memory allocated: 5120 MB

Will you help me understand the logic behind what is happening here?

Thanks a lot
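For reference, one way to cap per-user usage without creating separate queues is the Capacity Scheduler's user-limit settings on the existing default queue. A minimal sketch of capacity-scheduler.xml properties (the values here are illustrative assumptions, not taken from this cluster):

yarn.scheduler.capacity.root.default.minimum-user-limit-percent = 25
yarn.scheduler.capacity.root.default.user-limit-factor = 1

With minimum-user-limit-percent = 25, each active user is guaranteed at least a quarter of the queue when four or more users compete, and user-limit-factor = 1 caps a single user at the queue's configured capacity, which indirectly bounds how many containers one user can hold at once.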
						
					
05-08-2019 10:02 PM
Is there any way to prevent memory utilization from exceeding the maximum memory allocated in the YARN ResourceManager?

My configuration in YARN is:
yarn.scheduler.minimum-allocation-mb = 1024
yarn.scheduler.maximum-allocation-mb = 4096
yarn.scheduler.minimum-allocation-vcores = 3
yarn.scheduler.maximum-allocation-vcores = 3

The problem: YARN appears to ignore this static configuration and allocates more memory. In the YARN UI I am seeing:
Running containers: 72
Allocated CPU vCores: 72
Allocated memory MB: 120034

Please help me cap usage at the maximums set in the YARN configuration.

Thanks a lot
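For context, yarn.scheduler.maximum-allocation-mb limits what a single container may request, not the cluster-wide total, so 72 containers can legitimately add up to about 120,000 MB even while each one stays under 4096 MB. The aggregate is bounded per node by the NodeManager resource settings, and per user or application by scheduler limits. A minimal sketch of the relevant yarn-site.xml properties (values are illustrative assumptions, not from this cluster):

yarn.nodemanager.resource.memory-mb = 16384 (memory one NodeManager offers to YARN containers)
yarn.nodemanager.resource.cpu-vcores = 8 (vcores one NodeManager offers)

To bound what a single user or application can take inside a queue, the Capacity Scheduler user-limit properties sketched in the earlier post apply here as well.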
						
					
04-30-2019 05:47 AM
@Geoffrey Shelton Okot Thanks a lot for helping with this.
						
					
04-12-2019 02:09 PM
Will you please help me configure a new database? By default, spark-sql connects to Hive, and I want to configure a different database, such as PostgreSQL, instead of Hive. It would be very helpful if you could show me how to set this up.
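For reference, a common way to work with PostgreSQL data from Spark instead of Hive tables is the JDBC data source, with the PostgreSQL JDBC driver jar on the classpath. A minimal PySpark sketch; the jar path, host, database, table, and credentials below are placeholder assumptions:

from pyspark.sql import SparkSession

# Start a session with the PostgreSQL JDBC driver on the classpath
# (the jar path is a placeholder).
spark = (SparkSession.builder
         .appName("postgres-example")
         .config("spark.jars", "/path/to/postgresql-jdbc.jar")
         .getOrCreate())

# Read a PostgreSQL table over JDBC; all connection details are placeholders.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://dbhost:5432/mydb")
      .option("dbtable", "public.my_table")
      .option("user", "myuser")
      .option("password", "mypassword")
      .option("driver", "org.postgresql.Driver")
      .load())

df.show()

The same options work from the PySpark shell by passing the driver jar with --jars when launching.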
						
					
Labels: Hortonworks Data Platform (HDP)