Member since 03-04-2019
- 59 Posts
- 24 Kudos Received
- 5 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 6297 | 07-26-2018 08:10 PM |
|  | 8177 | 07-24-2018 09:49 PM |
|  | 3922 | 10-08-2017 08:00 PM |
|  | 3209 | 07-31-2017 03:17 PM |
|  | 1123 | 12-05-2016 11:24 PM |

06-15-2017 09:42 PM
1 Kudo
In the latest HDP 2.6.x, Oozie works with either Spark 1 or Spark 2; it does not support side-by-side deployments. You can follow these instructions to configure Oozie to work with a different version of Spark.
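As a concrete illustration, here is a minimal sketch of pointing an Oozie Spark action at Spark 2 from job.properties, assuming a sharelib directory named spark2 has already been created and populated in HDFS:

```
# job.properties (sketch; assumes a 'spark2' Oozie sharelib already exists in HDFS)
oozie.use.system.libpath=true
# Have the Spark action pick up the spark2 sharelib instead of the default 'spark' one
oozie.action.sharelib.for.spark=spark2
```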
						
					
06-15-2017 09:23 PM
1 Kudo
Assuming all the other services are working properly except for Hive LLAP, you can try 'Regenerate Keytabs' in Ambari. While doing that, check the 'Only regenerate keytabs for missing hosts and components' option. If that doesn't work, you can regenerate the keytabs for all hosts, which requires all components to be restarted. Also, if you are using a test AD KDC, you can try restarting it by following these instructions. Hope that helps.
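If you prefer scripting this over clicking through the Ambari UI, the same regeneration can be triggered through the Ambari REST API; the sketch below assumes Ambari 2.x, with the host, credentials, and CLUSTER_NAME as placeholders:

```
# Regenerate only the missing keytabs via the Ambari REST API (sketch; adjust host, credentials, CLUSTER_NAME)
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"Clusters": {"security_type": "KERBEROS"}}' \
  "http://ambari-host:8080/api/v1/clusters/CLUSTER_NAME?regenerate_keytabs=missing"

# Use regenerate_keytabs=all to regenerate keytabs for every host (requires restarting the affected components)
```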
						
					
06-07-2017 06:13 PM
Is it failing for this particular Spark job or for all the Oozie jobs? As Mark pointed out, setting 'mapreduce.framework.name=yarn' should be fine. Also, in case you have enabled YARN RM HA, you should use jobTracker port 8032 instead of 8050.
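For reference, this is roughly what the relevant lines of the workflow's job.properties would look like; the hostnames below are placeholders:

```
# job.properties (sketch; hostnames are placeholders)
nameNode=hdfs://namenode-host:8020
# With YARN RM HA enabled, point jobTracker at the ResourceManager's 8032 port rather than 8050
jobTracker=resourcemanager-host:8032
mapreduce.framework.name=yarn
```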
						
					
05-31-2017 09:16 PM
1 Kudo
During the Ambari installation, if all the hosts are already integrated with AD via SSSD, and all the service accounts are already available in AD, will Ambari still try to create those accounts locally?
						
					
Labels:
- Apache Ambari
03-02-2017 09:24 PM
Thanks, Robert, for the information. configs.py helped. I agree it would be nice if we could edit the KDC, DN, LDAP URL, etc. directly in Ambari.
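For anyone finding this later, here is roughly how configs.py can be used to inspect and adjust the Kerberos settings. This is a sketch based on the Ambari 2.x script; the script path, flag names, and the kdc_hosts property name may vary between versions:

```
# Dump the current kerberos-env settings (sketch; adjust host, credentials, CLUSTER_NAME)
python /var/lib/ambari-server/resources/scripts/configs.py \
  -u admin -p admin -l ambari-host -t 8080 -n CLUSTER_NAME \
  -a get -c kerberos-env

# Update a single property, e.g. the KDC host
python /var/lib/ambari-server/resources/scripts/configs.py \
  -u admin -p admin -l ambari-host -t 8080 -n CLUSTER_NAME \
  -a set -c kerberos-env -k kdc_hosts -v new-kdc.example.com
```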
						
					
03-02-2017 08:06 PM
1 Kudo
Once a cluster is Kerberized through the Ambari Kerberos Wizard, how can I review or potentially make changes to the settings, such as the KDC server, LDAP URL, admin principal, etc., at a later time? It appears the only option is to disable Kerberos and go through the Kerberos Wizard again.
						
					
Labels:
- Apache Ambari
03-01-2017 03:30 AM
Thanks, Sunile, I am familiar with all the ports. The issue is that the BI tool, in this case SAS, can ping the local Windows machine but not the Sandbox running on it. I would assume we will need to make some changes to the hosts file so the Sandbox IP can be exposed.
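For what it's worth, a typical hosts-file entry on the Windows machine would look like the sketch below; the IP depends on how the Sandbox VM is networked (use the VM's host-only or bridged IP, or 127.0.0.1 if the VM uses NAT with port forwarding):

```
# C:\Windows\System32\drivers\etc\hosts (sketch; replace the IP with your Sandbox VM's address)
192.168.56.101   sandbox.hortonworks.com   sandbox
```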
						
					
02-28-2017 08:03 PM
1 Kudo
How do I connect a BI tool, SAS for instance, to a HiveServer2 instance running on an HDP 2.5 Sandbox? I assume it involves IP parsing? Thanks.
						
					
Labels:
- Apache Hive
02-26-2017 07:37 PM
This might be something you are interested in. As Sunile pointed out, you might use NiFi to get data loaded into your Hadoop cluster, then use the NiFi ExecuteScript processor or create a custom NiFi processor to launch your Oozie workflow job. Think of it this way: use NiFi to get data from sources outside of your Hadoop cluster, then use Falcon processes or Oozie workflow jobs to handle work scheduling inside the cluster.
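Under the hood, such a script or custom processor would typically just call the Oozie REST API to submit and start the workflow. A rough sketch with curl, where the host, HDFS path, and user name are placeholders:

```
# Submit and start an Oozie workflow via the REST jobs endpoint (sketch; host, path, and user are placeholders)
curl -X POST -H "Content-Type: application/xml" \
  -d '<configuration>
        <property><name>user.name</name><value>hdfs-user</value></property>
        <property><name>oozie.wf.application.path</name><value>hdfs://namenode-host:8020/apps/my-workflow</value></property>
      </configuration>' \
  "http://oozie-host:11000/oozie/v1/jobs?action=start"
```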
						
					