Member since 02-26-2018
Posts: 15
Kudos Received: 0
Solutions: 0
    
	
		
		
08-16-2018 06:43 AM
Hi, we have been using an HDI instance with Spark 2.2. In this instance we are loading data from Spark into a bucketed Hive table. We recently looked at moving to HDP 2.6 on Cloudbreak but can't get the same code working due to the error "is bucketed but Spark currently does NOT populate bucketed output which is compatible with Hive". Is there a way to enable this functionality, and if not, is there a reason it works on HDI Spark 2.2?
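
A minimal sketch of the workaround that is commonly suggested for this check, assuming the goal is just to get the insert running again: disabling hive.enforce.bucketing and hive.enforce.sorting makes Spark skip the compatibility check, but the files it writes are not laid out in Hive-compatible buckets, so Hive-side bucket map joins or sampling on the table can misbehave. The database and table names below are placeholders.

from pyspark.sql import SparkSession

# Relax the two Hive settings Spark consults before refusing to write into a
# bucketed table. The write then proceeds, but without Hive-compatible bucketing.
spark = (
    SparkSession.builder
    .appName("load-bucketed-hive-table")
    .config("spark.hadoop.hive.enforce.bucketing", "false")
    .config("spark.hadoop.hive.enforce.sorting", "false")
    .enableHiveSupport()
    .getOrCreate()
)

df = spark.table("staging_db.events_raw")                      # placeholder source table
df.write.mode("append").insertInto("prod_db.events_bucketed")  # placeholder target table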
						
					
    
	
		
		
08-03-2018 10:46 AM
Just figured it out. I had previously filled in the basic section, and it seems to conflict if you don't clear it when moving to the advanced configuration. I have cleared the basic section and the configuration has started. Thank you for your help and the very prompt responses 🙂
						
					
    
	
		
		
08-03-2018 10:38 AM
Just added those and I'm now getting "Kerberos configuration contains inconsistent parameters".
						
					
    
	
		
		
08-03-2018 10:28 AM
I have tried the following in a few different ways: removing the kerberos-env and just using properties, and also getting the kerberos-descriptor from the API and using that. I get the message "The descriptor must be a valid JSON with the required fields Kerberos configuration contains inconsistent parameters" with the code below.

{
  "kerberos-env": {
    "properties": {
      "password_min_uppercase_letters": "1",
      "password_min_whitespace": "0",
      "password_min_punctuation": "1",
      "manage_auth_to_local": "true",
      "password_min_digits": "1",
      "set_password_expiry": "false",
      "encryption_types": "aes des3-cbc-sha1 rc4 des-cbc-md5",
      "kdc_create_attributes": "",
      "create_ambari_principal": "true",
      "password_min_lowercase_letters": "1",
      "password_length": "20",
      "case_insensitive_username_rules": "true",
      "manage_identities": "true",
      "password_chat_timeout": "5",
      "ad_create_attributes_template": "\n{\n  \"objectClass\": [\"top\", \"person\", \"organizationalPerson\", \"user\"],\n  \"cn\": \"$principal_digest_256\",\n  #if( $is_service )\n  \"servicePrincipalName\": \"$principal_name\",\n  #end\n  \"userPrincipalName\": \"$normalized_principal\",\n  \"unicodePwd\": \"$password\",\n  \"accountExpires\": \"0\",\n  \"userAccountControl\": \"66048\"\n}",
      "preconfigure_services": "DEFAULT",
      "install_packages": "true",
      "ldap_url": "ldaps://system.example.com:636",
      "executable_search_paths": "/usr/bin, /usr/kerberos/bin, /usr/sbin, /usr/lib/mit/bin, /usr/lib/mit/sbin",
      "group": "ambari-managed-principals",
      "kdc_type": "active-directory"
    }
  }
}
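
One way to sanity-check the structure is to pull the Kerberos descriptor that a working Ambari cluster already exposes and compare its shape with the JSON above. The sketch below assumes an Ambari endpoint, admin credentials and cluster name (all placeholders) and that the composite descriptor is available as the kerberos_descriptor artifact; adjust to the actual environment.

import json
import requests

AMBARI_URL = "https://ambari.example.com:8443"   # assumption: Ambari base URL
CLUSTER = "mycluster"                            # assumption: cluster name
AUTH = ("admin", "admin")                        # assumption: admin credentials

# Ask Ambari for the composite Kerberos descriptor of an existing cluster so its
# structure can be compared with the hand-written kerberos-env block.
resp = requests.get(
    f"{AMBARI_URL}/api/v1/clusters/{CLUSTER}/artifacts/kerberos_descriptor",
    auth=AUTH,
    headers={"X-Requested-By": "ambari"},
    verify=False,   # self-signed certificates are common on test clusters
)
resp.raise_for_status()

# The descriptor itself is expected under artifact_data (assumption).
descriptor = resp.json().get("artifact_data", {})
print(json.dumps(descriptor, indent=2))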
 
						
					
    
	
		
		
08-03-2018 09:34 AM
I'm trying to add some advanced Kerberos options within Cloudbreak and am stuck on the format of the kerberos-env JSON descriptor. I have tried a few things and keep getting "The descriptor must be a valid JSON with the required fields". Can anyone advise on the format that should be used?
						
					
Labels: Hortonworks Cloudbreak
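
A quick local check can at least rule out plain JSON mistakes before pasting the descriptor into the Cloudbreak UI. This is only a sketch: it assumes the descriptor sits in a file passed on the command line, and it only verifies that kerberos-env.properties is a JSON object, since the full list of fields Cloudbreak requires isn't spelled out in the error.

import json
import sys

# Parse the descriptor file and report the most basic structural problems.
with open(sys.argv[1]) as fh:
    try:
        descriptor = json.load(fh)
    except json.JSONDecodeError as exc:
        sys.exit(f"Not valid JSON: {exc}")

env = descriptor.get("kerberos-env") if isinstance(descriptor, dict) else None
if not isinstance(env, dict) or not isinstance(env.get("properties"), dict):
    sys.exit("kerberos-env.properties is missing or not a JSON object")

print("Descriptor parses; keys under kerberos-env.properties:")
print(", ".join(sorted(env["properties"])))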
			
    
	
		
		
08-03-2018 03:02 AM
Hi, I am setting up a Kerberized cluster with Cloudbreak 2.7 on Azure. We have created a cluster install using the simple options and it all works well; however, once we come to installing a cluster using Kerberos we run into an error when creating the principals: Failed to create the account for HTTP/hostname.guid.px.internal.cloudapp.net@EXAMPLE.COM. It seems that the principal is too long to fit into the 64-character limit that we have in Active Directory, because the hostname is too long. My questions are: Is there a way around this issue? Has anyone else managed to set up Kerberos on Azure using Active Directory, and if so, how? Thanks for any help that can be provided.
						
					
Labels: Hortonworks Cloudbreak
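
A quick way to see which hosts will hit the 64-character limit described above is to compute the length of each generated HTTP principal before enabling Kerberos. A minimal sketch; the realm and hostnames are placeholders for the real Azure-generated FQDNs.

# Flag any generated HTTP principals that would exceed the 64-character limit
# mentioned above. Realm and hostnames are placeholders.
REALM = "EXAMPLE.COM"
AD_NAME_LIMIT = 64

hosts = [
    "master0.abc123xyz456.px.internal.cloudapp.net",
    "worker0.abc123xyz456.px.internal.cloudapp.net",
]

for host in hosts:
    principal = f"HTTP/{host}@{REALM}"
    status = "TOO LONG" if len(principal) > AD_NAME_LIMIT else "ok"
    print(f"{len(principal):3d}  {status}  {principal}")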
			
    
	
		
		
08-02-2018 02:40 AM
I am having the same issue with a Kerberized cluster created through Cloudbreak 2.7. Did you manage to find a workaround for the FQDN length?
						
					
    
	
		
		
05-01-2018 07:09 AM
I'm having this same problem. I recently moved our cluster to Ubuntu; when using the previous CentOS it was working fine. I have tried the case conversion options with no luck. I can, however, access everything if I add the user to Ranger and not the group.
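
One thing worth checking in this situation is the exact group names (including their case) that the operating system reports for the affected user, compared with the group names referenced in the Ranger policy. A small sketch; the username is a placeholder.

import grp
import pwd

USER = "analyst1"   # placeholder: the user whose group-based policy is failing

# Collect every group the OS associates with the user, including the primary group.
primary_gid = pwd.getpwnam(USER).pw_gid
groups = {g.gr_name for g in grp.getgrall() if USER in g.gr_mem}
groups.add(grp.getgrgid(primary_gid).gr_name)

# Compare these strings, case and all, with the groups used in the Ranger policy.
print(f"Groups reported for {USER}: {sorted(groups)}")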
						
					