Member since: 04-12-2019

- 105 Posts
- 3 Kudos Received
- 7 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4312 | 05-28-2019 07:41 AM |
| | 2936 | 05-28-2019 06:49 AM |
| | 2311 | 12-20-2018 10:54 AM |
| | 1600 | 06-27-2018 09:05 AM |
| | 8302 | 06-27-2018 09:02 AM |
			
    
	
		
		
06-04-2018 05:05 PM
@Felix Albani I agree with the 2nd and 3rd answers. If we use user@AD.REALM to access a kerberized service on the cluster, how do we define service access for user@AD.REALM? As far as I know, we don't need to create any service principals on the AD server; we just have to create a trust with the AD servers. Can you please help me understand the concept?

Regards,
Vinay
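For the service-access part of the question: once the one-way trust is in place, cluster services typically authorize AD users through Hadoop's `hadoop.security.auth_to_local` mapping rather than through extra service principals in AD. A hedged sketch of such a rule in core-site.xml (the realm name `AD.REALM` is taken from this thread; the exact rule depends on your principal naming):

```xml
<property>
  <name>hadoop.security.auth_to_local</name>
  <value>
RULE:[1:$1@$0](.*@AD\.REALM)s/@AD\.REALM//
DEFAULT
  </value>
</property>
```

The rule strips the AD realm, so `user@AD.REALM` maps to the local short name `user`, which HDFS and YARN ACLs can then reference directly.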
						
					
	
		
		
06-04-2018 09:58 AM
Hi Folks,

I have configured an MIT KDC integrated with Active Directory, following this link: https://community.hortonworks.com/articles/59635/one-way-trust-mit-kdc-to-active-directory.html

My questions are:

1. How can I test whether the one-way trust was created successfully?
2. Users will live on the AD server and services will live on the Hadoop cluster. Do I have to create user principals in the Kerberos database?
3. If yes, do I have to add a principal to Kerberos manually whenever a new user is created on the AD server?

Regards,
Vinay
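For question 1, a common smoke test of the one-way trust is to obtain a ticket from the AD realm and then touch a kerberized service on the cluster. This is only a sketch: `vinay@AD.REALM` is a hypothetical test user, and the realm names are placeholders for your actual MIT and AD realms.

```shell
# Get a TGT from the AD realm (users live in AD per the setup above)
kinit vinay@AD.REALM

# Access a kerberized service; success means the cross-realm path works
hdfs dfs -ls /

# The cache should now also show a cross-realm ticket,
# e.g. krbtgt/MIT.REALM@AD.REALM (realm names are placeholders)
klist
```

If `kinit` succeeds but the service access fails, the trust principal or the realm mapping (krb5.conf `[capaths]`/`[domain_realm]`) is the usual place to look.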
						
					
Labels:
- Apache Hadoop
- Kerberos
			
    
	
		
		
05-29-2018 08:22 AM
Thanks, Aditya.
						
					
    
	
		
		
05-28-2018 09:49 AM
Hi Folks,

Hope all are doing well. I'm new to Spark. I have installed HDP 2.6.2 and added Spark as a service. Before starting Spark, no jobs were running, but once I started the Spark service I found two jobs running continuously in the UNDEFINED state:

    # yarn application -list
    18/05/28 15:07:51 INFO client.AHSProxy: Connecting to Application History server at 10.10.10.16:10200
    18/05/28 15:07:51 INFO client.RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
    18/05/28 15:07:51 INFO client.RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
    Total number of applications (application-types: [] and states: [SUBMITTED, ACCEPTED, RUNNING]):4
    Application-Id                  Application-Name                                           Application-Type  User  Queue  State    Final-State  Progress  Tracking-URL
    application_1527494556086_0039  org.apache.spark.sql.hive.thriftserver.HiveThriftServer2  SPARK             hive  admin  RUNNING  UNDEFINED    10%       http://10.10.10.8:4040
    application_1527494556086_0038  org.apache.spark.sql.hive.thriftserver.HiveThriftServer2  SPARK             hive  admin  RUNNING  UNDEFINED    10%       http://10.10.10.12:4040

The Thrift Server is installed on both servers, whose IPs are 10.10.10.8 and 10.10.10.12. Can you please help me clarify this?
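For context: in YARN, Final-State stays UNDEFINED until an application actually finishes, so RUNNING/UNDEFINED is what a long-lived service like HiveThriftServer2 normally looks like. One way to keep an eye on such apps is to parse the listing; a minimal Python sketch (the sample text mirrors the output above; feeding in live output via `subprocess` is an assumption left to the reader):

```python
# Sketch: pick out long-running apps from `yarn application -list` output.
listing = """\
application_1527494556086_0039  org.apache.spark.sql.hive.thriftserver.HiveThriftServer2  SPARK  hive  admin  RUNNING  UNDEFINED  10%  http://10.10.10.8:4040
application_1527494556086_0038  org.apache.spark.sql.hive.thriftserver.HiveThriftServer2  SPARK  hive  admin  RUNNING  UNDEFINED  10%  http://10.10.10.12:4040
"""

undefined_apps = []
for line in listing.splitlines():
    fields = line.split()
    # fields: app-id, name, type, user, queue, state, final-state, progress, url
    if len(fields) >= 9 and fields[6] == "UNDEFINED":
        undefined_apps.append((fields[0], fields[8]))

for app_id, url in undefined_apps:
    print(app_id, url)
```

The same information is available per application via `yarn application -status <application-id>` on the cluster.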
						
					
		
			
				
						
Labels:
- Apache Hive
- Apache Spark
- Apache YARN
			
    
	
		
		
05-21-2018 05:46 AM
Hi Folks,

Hope all are doing well. I have a dev setup of 11 nodes (2x NameNode, 8x DataNode, 1x edge node). All nodes are connected by a private cluster network (only these 11 machines can reach each other). We use a floating IP (public only within the organization) on the DataNodes as well as on the edge node to access data from a DB.

If we keep the floating IP only on the edge node and run a MapReduce job from the edge node that imports data from the DB into HDFS, I get an error saying the DataNode IP cannot reach the source DB.

Can someone suggest whether we need a floating IP on every DataNode machine, or whether there is another solution? Will be very thankful to you.

Regards,
Vinay K
						
					
		
			
				
						
Labels:
- Apache Hadoop
			
    
	
		
		
05-09-2018 05:31 AM
@Rajendra Manjunath I have installed HDF 3.0.3, and it installed properly via the CLI. Now I'm stuck on "Install on..." in Ambari: under HDF the cluster name is "none", and when I click "Install on..." and then click Test_cluster, I am redirected straight to the installed HDP components. HDF is not listed among the installed components.
						
					
			
    
	
		
		
05-08-2018 09:25 AM
@Rajendra Manjunath I'm using Ambari version 2.5.2, and I'm using the mpack.
						
					
			
    
	
		
		
05-08-2018 08:39 AM
@Rajendra Manjunath While installing HDF on HDP, I got the error below:

    Caused by: org.xml.sax.SAXParseException; systemId: file:/var/lib/ambari-server/resources/stacks/HDF/3.1/upgrades/nonrolling-upgrade-3.1.xml; lineNumber: 213; columnNumber: 94; cvc-complex-type.3.2.2: Attribute 'supports-patch' is not allowed to appear in element 'task'
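A SAXParseException like this usually means the installed Ambari is validating the stack XML against a schema that predates the `supports-patch` attribute, i.e. an Ambari/mpack version mismatch, so checking the HDF-Ambari compatibility matrix is the first step. Purely as an illustrative sketch (an assumption about the workaround, not a documented fix), stripping the attribute from a task element would look like this:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch only: drop the 'supports-patch' attribute that the
# parser rejected; the snippet mimics a <task> element from the upgrade XML.
snippet = '<upgrade><task supports-patch="true" type="execute"/></upgrade>'
root = ET.fromstring(snippet)
for task in root.iter("task"):
    # Remove the attribute if present; leave other attributes untouched
    task.attrib.pop("supports-patch", None)
cleaned = ET.tostring(root, encoding="unicode")
print(cleaned)
```

Upgrading Ambari to a version that understands the newer stack definition is the safer path than editing files under /var/lib/ambari-server by hand.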
						
					
			
    
	
		
		
05-08-2018 06:01 AM
We have HDP 2.6.2 running properly and are looking to install NiFi in our environment. Is it recommended to install HDF on an existing HDP cluster? And is there an HDF version compatible with HDP 2.6.2?
						
					
		
			
				
						
Labels:
- Apache NiFi
- Cloudera DataFlow (CDF)