Member since 09-25-2015
      
- 72 Posts
- 61 Kudos Received
- 20 Solutions
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
| | 4963 | 03-10-2017 01:57 PM |
| | 2304 | 12-14-2016 01:22 PM |
| | 2077 | 12-12-2016 10:54 AM |
| | 4520 | 11-07-2016 04:24 PM |
| | 1088 | 09-23-2016 12:32 PM |

11-04-2016 01:05 PM

Hi, the docs say: "Shell interpreter uses Apache Commons Exec to execute external processes. In Zeppelin notebook, you can use %sh in the beginning of a paragraph to invoke system shell and run commands. Note: Currently each command runs as the user Zeppelin server is running as." Is there any way to execute the shell interpreter as the current Zeppelin user, for example by introducing a special shell for Apache Commons Exec, passing a $USER parameter, etc.? /Best regards, Mats
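For illustration, a minimal sketch of the kind of behavior being asked about, done as a sudo-based workaround rather than a Zeppelin feature; the target user name and the sudo rights of the Zeppelin service account are assumptions:

```bash
%sh
# Hypothetical workaround: run the command as a named user via sudo.
# Assumes the account running the Zeppelin server has passwordless
# sudo rights for that user; "mats" is an illustrative user name.
sudo -u mats whoami
```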
						
					
- Labels: Apache Zeppelin

09-23-2016 12:32 PM
1 Kudo

Hi @Shishir Jaiswal, Assuming HDP components on the edge nodes such as clients and Knox, then mixed versions are not supported. Knox, for example, only supports specific versions of a Hadoop service (see section 2.8.1.2 in http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_security/bk_security-20160829.pdf). Mixed OS versions usually do not affect things, as long as all requirements around libraries, JVM versions, etc. are fulfilled. /Best regards, Mats
						
					
09-23-2016 12:06 PM
1 Kudo

Hi @Mohana Murali Gurunathan, Unfortunately not; Ranger currently has plugins and support for HDFS, Hive, HBase, Kafka, Knox, YARN, Storm and Atlas. For more info please visit: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_security/content/overview_ranger_ambari_install.html /Best regards, Mats
						
					
09-23-2016 08:51 AM

Hi @rama, You can delete the old container logs. If yarn.nodemanager.log-dirs is full, no new containers will start on that node. See also:
- yarn.nodemanager.disk-health-checker.min-healthy-disks
- yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage
- yarn.nodemanager.disk-health-checker.min-free-space-per-disk-mb
/Best regards, Mats
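A minimal cleanup sketch under stated assumptions: the NodeManager log directory below is illustrative (check yarn.nodemanager.log-dirs for the real path), and only directories of applications that have finished should be removed:

```bash
# List container log directories older than 7 days, then delete them.
# /hadoop/yarn/log is an assumed path; verify it against the value of
# yarn.nodemanager.log-dirs before running the rm step.
find /hadoop/yarn/log -maxdepth 1 -type d -name 'application_*' -mtime +7 -print
find /hadoop/yarn/log -maxdepth 1 -type d -name 'application_*' -mtime +7 -exec rm -rf {} +
```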
						
					
09-22-2016 02:54 PM
1 Kudo

Hi @sivasaravanakumar k, The rate of replication work is throttled by HDFS so that it does not interfere with cluster traffic when failures happen during regular cluster load. The properties controlling this are dfs.namenode.replication.work.multiplier.per.iteration, dfs.namenode.replication.max-streams and dfs.namenode.replication.max-streams-hard-limit. The first controls the amount of work scheduled to a DataNode at every heartbeat, and the other two limit the maximum parallel threaded network transfers done by a DataNode at a time. These properties are described at https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml /Best regards, Mats
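As a quick way to see what a cluster is currently using, each key named above can be queried with hdfs getconf; overrides would go in hdfs-site.xml on the NameNode:

```bash
# Print the effective value of each replication-throttling property.
hdfs getconf -confKey dfs.namenode.replication.work.multiplier.per.iteration
hdfs getconf -confKey dfs.namenode.replication.max-streams
hdfs getconf -confKey dfs.namenode.replication.max-streams-hard-limit
```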
						
					
09-22-2016 02:39 PM
1 Kudo

Hi @Mahesh Mallikarjunappa, A Flume agent typically listens to a logfile or similar source until it is stopped by an operator, and that is where the "long-lived process" comes in. For more info on Flume please visit https://flume.apache.org/FlumeUserGuide.html /Best regards, Mats
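To make the long-lived nature concrete, here is a sketch of launching an agent; the agent name and config paths are assumptions:

```bash
# flume-ng runs in the foreground, reading its configured sources,
# until an operator stops it (e.g. Ctrl-C or a service manager).
# "a1" and the /etc/flume paths are illustrative.
flume-ng agent --name a1 --conf /etc/flume/conf --conf-file /etc/flume/conf/flume.conf
```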
						
					
09-19-2016 09:32 AM
1 Kudo

Hi @R c, The exam is handled by PSI Services / Innovative Exams, so you need to contact them. You can reach them at examsupport@examslocal.com, or call +1-888-504-9178 or +1-312-612-1049. /Best regards, Mats
						
					
09-16-2016 02:57 PM
2 Kudos

Hi @Ron Buckley, The drivers above are the ones Hortonworks distributes and supports. As AIX is not binary compatible with Linux, none of the above drivers will work on AIX. If you can get the source code, you can compile it for AIX using the AIX Linux affinity libraries, which according to IBM make AIX source-compatible with Linux. I also found "Download ODBC connectors", a commercial AIX Hive ODBC driver from Progress DataDirect. /Best regards, Mats
						
					
09-16-2016 10:39 AM

Hi @Jitendra, Please have a look at http://hadoop.apache.org/versioning.html; it covers Ambari as well. /Best regards, Mats
						
					
09-16-2016 10:31 AM
3 Kudos

Hi @Sunil Mukati, Yes, it works with the "Generic" Database Type, but it does not accept Netezza-specific syntax. You also need to set up a DBCPConnectionPool and a JDBC connection for your Netezza box. /Best regards, Mats
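As an illustration of the DBCPConnectionPool side, a sketch with assumed values; the host, port, database, and jar path are placeholders, and org.netezza.Driver / nzjdbc.jar reflect the standard Netezza JDBC driver to the best of my knowledge:

```bash
# Place the Netezza JDBC driver where the NiFi service account can read it
# (the target path is an assumption):
cp nzjdbc.jar /opt/nifi/drivers/
# Then set the DBCPConnectionPool properties along these lines (illustrative):
#   Database Connection URL:    jdbc:netezza://netezza-host:5480/MYDB
#   Database Driver Class Name: org.netezza.Driver
#   Database Driver Location:   /opt/nifi/drivers/nzjdbc.jar
```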
						
					