Member since: 09-29-2015

- Posts: 286
- Kudos Received: 601
- Solutions: 60

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 12846 | 03-21-2017 07:34 PM |
| | 3766 | 11-16-2016 04:18 AM |
| | 2142 | 10-18-2016 03:57 PM |
| | 5100 | 09-12-2016 03:36 PM |
| | 8440 | 08-25-2016 09:01 PM |
			
    
	
		
		
08-25-2016 02:16 PM (2 Kudos)
I am not able to install the HAWQ Standby Master on an AWS cluster running HDP 2.4.2 and Ambari 2.2.2.
Here is the error:
"This can be run only on master or standby host". Not sure what that means.
It is not being installed on a DataNode with PXF installed.
It is not being installed on the Ambari node.
I am using the NameNode (since I only have 3 HDP master nodes) to install the HAWQ Standby Master.
I attempted to remove the HAWQ Standby (which does not start) from the NameNode and placed it on another node just to test. It gives the same error.
So right now I am just running without a standby master.
How do I begin troubleshooting this?
						
					
Labels: Apache Ambari
    
	
		
		
08-15-2016 01:16 AM (1 Kudo)
							 It should point to https://network.pivotal.io/products/pivotal-hdb 
						
					
08-12-2016 01:11 AM
Here is also a good article: https://community.hortonworks.com/articles/22756/quickly-enable-ssl-encryption-for-hadoop-component.html
						
					
07-11-2016 03:37 PM
							 Double check your /etc/hosts file.  Double check your DNS. 
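Both checks can be scripted. A minimal sketch, assuming each node's own FQDN (whatever `hostname -f` reports) should resolve forward and reverse consistently, whether the answer comes from /etc/hosts or DNS:

```shell
# Sanity-check name resolution on a node: forward lookup of the node's
# own FQDN, then reverse lookup of the resulting IP. getent consults
# both /etc/hosts and DNS, in nsswitch.conf order.
fqdn=$(hostname -f)
echo "FQDN: $fqdn"
getent hosts "$fqdn" || echo "WARNING: $fqdn does not resolve"
ip=$(getent hosts "$fqdn" | awk 'NR==1{print $1}')
if [ -n "$ip" ]; then
  # Reverse lookup: the IP should map back to the same name.
  getent hosts "$ip"
fi
```

Run this on every node; a mismatch between the forward and reverse answers is a common source of Hadoop service registration failures.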
						
					
05-26-2016 03:40 PM
I assume that Hive with PAM authentication would also be a viable option on Azure. https://community.hortonworks.com/articles/591/using-hive-with-pam-authentication.html
						
					
05-24-2016 02:41 PM (1 Kudo)
		
	
				
		
	
		
					
Using Centrify with AD handles this.
						
					
05-24-2016 01:50 PM (1 Kudo)
		
	
				
		
	
		
					
Ranger and Knox are NOT LDAP servers. Use AD, OpenLDAP, or FreeIPA. Ranger is ONLY for authorization, NOT authentication. Here are your authentication options for Hive.
However, if you decide to enable Kerberos, then the Hive authentication option is no longer LDAP directly but Kerberos (with LDAP indirectly). The GitHub repos given above use Kerberos against FreeIPA or OpenLDAP (and for one node).
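To make the two options concrete, here is a sketch of the corresponding beeline connection styles. The host and realm below are placeholders, not values from this thread:

```shell
# Placeholder host/realm -- substitute your own environment's values.
HS2_HOST="hs2.example.com"     # assumed HiveServer2 host
REALM="EXAMPLE.COM"            # assumed Kerberos realm

# LDAP (or PAM) auth: beeline passes a username/password that
# HiveServer2 validates against the directory.
ldap_url="jdbc:hive2://${HS2_HOST}:10000/default"
echo "beeline -u \"${ldap_url}\" -n aduser -p '...'"

# Kerberos auth: no -n/-p; the client's ticket cache (from kinit)
# identifies the caller, and the JDBC URL carries HiveServer2's
# service principal instead.
krb_url="jdbc:hive2://${HS2_HOST}:10000/default;principal=hive/_HOST@${REALM}"
echo "kinit aduser@${REALM} && beeline -u \"${krb_url}\""
```

Note the difference: with LDAP the credentials travel in the connection, while with Kerberos the ticket obtained beforehand does the authenticating.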
 
						
					
05-23-2016 07:55 PM
					
@Brandon Wilson Thank you, this helps. Yep, please provide more detail if you can.
						
					
05-23-2016 07:30 PM
					
We are looking for reassurance that our data sitting in Hive right now is protected from the outside world. We set up SQL Workbench on our local PCs and connect to it as shown below. However, it doesn't matter what we put in for username and password; it still lets us connect to our Hive data, which concerns us for security reasons.
How can I be assured that if some hacker out in the world came across our external IP (xx.xx.xx.xxxx), they wouldn't be able to access our data in Hive? I cannot set up AD auth from Azure since I cannot access my corporate AD, so LDAP authentication is not possible, and I do not want to set up an MIT KDC. Should I:
- Leave Hive authentication set to None but apply SQL Standard authorization (see https://community.hortonworks.com/questions/22086/can-we-enable-hive-acls-without-ranger.html and https://cwiki.apache.org/confluence/display/Hive/SQL+Standard+Based+Hive+Authorization)?
- Set up Ranger instead of SQL Standard authorization?

With either of the above, would this ensure that if someone logs in as hive/hive, the tables are still secured with the appropriate authorizations?
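For reference, the SQL Standard authorization route from the links above is normally switched on through a handful of hive-site.xml properties. A sketch (property names per the Hive wiki page linked above; validate against your HDP version):

```xml
<!-- Enable SQL Standard Based Authorization in HiveServer2 -->
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory</value>
</property>
<property>
  <name>hive.security.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator</value>
</property>
<property>
  <!-- Run queries as the hive service user so GRANT/REVOKE apply -->
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>
```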
						
					
Labels: Apache Hive
    
	
		
		
05-20-2016 05:23 PM
					
							 See also https://github.com/steveloughran/kerberos_and_hadoop/blob/master/sections/errors.md 
						
					