Member since 09-25-2015

72 Posts | 61 Kudos Received | 20 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 4969 | 03-10-2017 01:57 PM |
|  | 2319 | 12-14-2016 01:22 PM |
|  | 2077 | 12-12-2016 10:54 AM |
|  | 4531 | 11-07-2016 04:24 PM |
|  | 1089 | 09-23-2016 12:32 PM |

03-10-2017 01:57 PM | 1 Kudo

Hi @rahul gulati, the Atlas GUI works in Chrome and Firefox; I'm using HDP 2.5.3 with the latest Chrome version without any problems. hdfs_path is an Atlas metadata type, but unfortunately there is no HDFS hook available in 2.5 that automatically populates hdfs_path entities, so you probably do not have any HDFS entity data in Atlas. HDP 2.6 will include some functionality for HDFS entity creation. You can always build your own hook - a good document for understanding the Atlas model is http://atlas.incubator.apache.org/0.7.1-incubating/AtlasTechnicalUserGuide.pdf. /Best regards, Mats
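
If you want to verify what (if any) hdfs_path entities exist, you can query Atlas directly. A minimal sketch using the v1 DSL search endpoint, assuming default admin credentials and the standard Atlas port - adjust host, port and credentials for your cluster:

```
# Ask Atlas for all entities of type hdfs_path via the DSL search API.
# An empty result list confirms that no HDFS entities have been created yet.
curl -u admin:admin \
  'http://atlas-host:21000/api/atlas/discovery/search/dsl?query=hdfs_path'
```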

03-10-2017 01:07 PM | 2 Kudos

Hi @Mehul Shah, you are probably missing Java JCE support or the Unlimited Strength Jurisdiction Policy files, or have some mismatch with your keys for OpenSSL. Please visit http://docs.hortonworks.com/HDPDocuments/HDF2/HDF-2.1.2/bk_dataflow-ambari-installation/content/distribute_and_install_the_jce.html (this is for HDF 2.1 installed with Ambari, but the same doc is available for the other install options). There is also some info around PKCS12/OpenSSL in https://issues.apache.org/jira/browse/NIFI-3062. /Best regards, Mats
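
A quick way to check whether the unlimited-strength policy files are actually in place is to ask the JVM for the maximum allowed AES key length, for example with the jrunscript tool that ships with the JDK:

```
# Prints the maximum AES key length the JVM will allow.
# 128 means the default (limited) JCE policy is active;
# 2147483647 means the unlimited-strength policy files are installed.
jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
```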

12-27-2016 01:52 PM | 2 Kudos

Hi @vamsi valiveti, Oozie is a scheduler, whereas Flume does not work on a schedule: Flume processes data as it receives it. So you use the Flume configuration to tell it, for example, that each time a file appears in a certain directory it should be put into HDFS (if you use the spooldir source), and so on. /Best regards, Mats
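
As an illustration of that last point, here is a minimal spooldir-to-HDFS agent configuration; the spool directory and NameNode paths are placeholders for your environment:

```
# Write a minimal Flume agent config: watch a spool directory and
# deliver every file that lands there to HDFS.
cat > spooldir-agent.conf <<'EOF'
a1.sources  = src1
a1.channels = ch1
a1.sinks    = snk1

a1.sources.src1.type     = spooldir
a1.sources.src1.spoolDir = /var/spool/flume/incoming
a1.sources.src1.channels = ch1

a1.channels.ch1.type = memory

a1.sinks.snk1.type          = hdfs
a1.sinks.snk1.hdfs.path     = hdfs://namenode:8020/data/incoming
a1.sinks.snk1.hdfs.fileType = DataStream
a1.sinks.snk1.channel       = ch1
EOF

# Start the agent; it reacts to new files immediately - no scheduler involved.
flume-ng agent --name a1 --conf conf --conf-file spooldir-agent.conf
```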

12-14-2016 01:22 PM | 1 Kudo

Hi @Roger Young, I haven't played with MiNiFi, but in NiFi in general the source origin is a flow file attribute, which you can find under the Attributes tab when viewing a flow file from the data provenance menu. /Best regards, Mats

12-14-2016 12:37 PM | 1 Kudo

Hi @Arsalan Siddiqi, I recommend using Ambari, as it gives you access to all configuration parameters and also to the data files in HDFS through the Ambari File View. Please be aware that when you use Ambari, any configuration changes need to be made through the Ambari GUI or API; editing the config files directly will not work (they are just a copy of the Ambari configuration). /Best regards, Mats
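
If you need to script configuration changes, they can go through the Ambari API as well. A sketch using the configs.sh helper that ships with the Ambari server; the cluster name, host and credentials below are placeholders:

```
# Read a config type through Ambari rather than from the files on disk.
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin \
  get localhost MyCluster core-site

# Update a property; Ambari then distributes the change to the nodes.
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin \
  set localhost MyCluster core-site "fs.trash.interval" "480"
```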

12-12-2016 03:50 PM | 1 Kudo

Hi @anand maurya, please have a look at the Hortonworks-Securosis white paper on securing Hadoop. The paper can be found at http://hortonworks.com/info/securing-hadoop/ /Best regards, Mats

12-12-2016 11:17 AM

Hi @subash sharma, make sure your Java environment is configured properly - for example JAVA_HOME. Please have a look at https://community.hortonworks.com/questions/39839/how-to-import-metadata-from-hive-into-atlas-and-th.html /Best regards, Mats
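
For example, before running any of the Atlas import tooling you can point the shell at the JDK explicitly; the JDK path below is a placeholder - use the one installed on your host:

```
# Make sure the expected JDK is the one the tooling will pick up.
export JAVA_HOME=/usr/jdk64/jdk1.8.0_77   # placeholder path
export PATH="$JAVA_HOME/bin:$PATH"
java -version   # sanity check that the right JVM is found
```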

12-12-2016 10:54 AM | 1 Kudo

Hi @Avijeet Dash, the Solr index requires persistent storage as well. There are several options for reading HBase from Hive and Solr from Hive, and they all involve storage handlers and SerDes, such as https://github.com/lucidworks/hive-solr and https://github.com/chimpler/hive-solr. For Hive/HBase integration there is also https://cwiki.apache.org/confluence/display/Hive/StorageHandlers. Hope this helps. /Best regards, Mats
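
As a concrete illustration of the storage-handler approach, this is roughly what mapping an existing HBase table into Hive looks like; the table, column family and column names are made up for the example:

```
# Expose an existing HBase table to Hive queries via the HBase storage handler.
# 'events', 'cf' and 'payload' are illustrative names.
hive -e "
CREATE EXTERNAL TABLE hbase_events (rowkey STRING, payload STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:payload')
TBLPROPERTIES ('hbase.table.name' = 'events');"
```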

11-07-2016 04:24 PM | 2 Kudos

Hi @Edgar Daeds, the Hive Metastore is designed for InnoDB and does not support MyISAM or ndbcluster as the database engine in MySQL. You can still replicate your Metastore by using MySQL replication of the binlogs.
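
A rough sketch of what that looks like with classic binlog replication; hostnames, credentials and the binlog coordinates are placeholders:

```
# 1) On the master, enable binary logging in my.cnf and restart mysqld:
#      [mysqld]
#      server-id = 1
#      log-bin   = mysql-bin

# 2) Still on the master: create a replication user and note the
#    current binlog file/position reported by SHOW MASTER STATUS.
mysql -u root -p -e "
  CREATE USER 'repl'@'%' IDENTIFIED BY 'repl_password';
  GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';
  SHOW MASTER STATUS;"

# 3) On the replica (server-id = 2), point it at the master using the
#    coordinates from step 2, then start replication.
mysql -u root -p -e "
  CHANGE MASTER TO MASTER_HOST='metastore-master',
    MASTER_USER='repl', MASTER_PASSWORD='repl_password',
    MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=154;
  START SLAVE;"
```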

11-07-2016 04:01 PM

Hi @vshukla, how do I enable impersonation for %sh? I can't find it in the Zeppelin GUI or in the docs. /Best regards, Mats